
Just another deep learning framework



MatterIx is a simple deep learning framework built to explain fundamental concepts such as autodiff, optimizers, and loss functions from first principles.

Features

MatterIx provides features such as automatic differentiation (autodiff) to compute gradients, optimizers, loss functions, and basic modules to create your own neural networks. The core value of MatterIx is that it is a distilled version of PyTorch, so it is easier to understand what is happening under the hood.

At its core, MatterIx uses reverse-mode autodiff to compute gradients. All computations are represented as a graph of tensors, with each tensor holding a reference to a function that can compute its local gradient. The partial derivative of each node is computed once the entire graph has been traversed.
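
To make this concrete, here is a minimal, self-contained sketch of the idea, using toy scalar values rather than MatterIx's actual Tensor class: each node stores its parents and the local gradients with respect to them, and backward() traverses the graph in reverse, accumulating gradients via the chain rule.

# Conceptual sketch of reverse-mode autodiff (not MatterIx's actual implementation)

class Scalar:
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data                  # value of this node
        self.grad = 0.0                   # accumulated d(output)/d(self)
        self.parents = parents            # nodes this value was computed from
        self.local_grads = local_grads    # d(self)/d(parent), one per parent

    def __add__(self, other):
        return Scalar(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        return Scalar(self.data * other.data, (self, other), (other.data, self.data))

    def backward(self):
        # Build a topological ordering of the graph, then push gradients
        # from each node to its parents (chain rule) in reverse order.
        order, visited = [], set()

        def topo(node):
            if node not in visited:
                visited.add(node)
                for p in node.parents:
                    topo(p)
                order.append(node)

        topo(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, local in zip(node.parents, node.local_grads):
                parent.grad += node.grad * local

# Example: c = a * b + a  =>  dc/da = b + 1 = 5, dc/db = a = 3
a, b = Scalar(3.0), Scalar(4.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)  # 5.0 3.0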

Installation

a. Install it from GitHub
# Install either with option-1 or option-2

# Option-1
pip install git+https://github.com/SiddeshSambasivam/MatterIx.git#egg=MatterIx

# Option-2
git clone https://github.com/SiddeshSambasivam/MatterIx.git
cd MatterIx

python setup.py install

b. Install from PyPI

# Install directly from PyPI repository
pip install --upgrade matterix

Example usage

import numpy as np
from tqdm import trange

from matterix import Tensor
import matterix.nn as nn
import matterix.functions as F
from matterix.loss import RMSE
from matterix.optim import SGD  # import path assumed for the SGD optimizer (added in 0.1.1); adjust if your version differs

# x and y are assumed to be NumPy arrays of inputs and targets (see the note after the training loop)
x_train, y_train = Tensor(x[:1500]), Tensor(y[:1500])
x_test, y_test = Tensor(x[1500:]), Tensor(y[1500:])

class Model(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.w1 = Tensor(np.random.randn(1, 150), requires_grad=True)
        self.b1 = Tensor(np.random.randn(1, 150), requires_grad=True)
        self.w2 = Tensor(np.random.randn(150, 1), requires_grad=True)
        self.b2 = Tensor(np.random.randn(1), requires_grad=True)

    def forward(self, x) -> Tensor:
        out_1 = (x @ self.w1) + self.b1
        out_2 = F.sigmoid(out_1)
        output = (out_2 @ self.w2) + self.b2

        return output

model = Model()
optimizer = SGD(model, model.parameters())

EPOCHS = 100
t_bar = trange(EPOCHS)

for i in t_bar:

    optimizer.zero_grad()

    y_pred = model(x_train)

    loss = RMSE(y_train, y_pred)

    loss.backward()

    optimizer.step()
    t_bar.set_description("Epoch: %.0f Loss: %.5f" % (i, loss.data))
    t_bar.refresh()
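
The training loop above assumes x and y already exist as NumPy arrays. For a quick end-to-end run, hypothetical synthetic data along the following lines would fit the model's (N, 1) input shape (an illustration, not part of MatterIx):

import numpy as np

# Hypothetical data: 2,000 samples of a noisy linear relationship
x = np.random.randn(2000, 1)
y = 3 * x + 2 + 0.1 * np.random.randn(2000, 1)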

Take a look at the examples directory for more usage examples.

Development setup

Install the necessary dependencies in a separate virtual environment

# Create a virtual environment during development to avoid dependency issues
pip install -r requirements.txt

# Before submitting a PR, run the unittests locally
pytest -v

Release history

  • 0.1.0

    • First stable release
    • ADD: Tensor, tensor operations, sigmoid functions
    • FIX: Inaccuracies with gradient computation
  • 0.1.1

    • ADD: Optimizer: SGD
    • ADD: Functions: Relu
    • ADD: Loss functions: RMSE, MSETensor
    • ADD: Module: For defining neural networks
    • FIX: Floating point precision issue when calculating gradient

Contributing

  1. Fork it

  2. Create your feature branch

    git checkout -b feature/new_feature
    
  3. Commit your changes

    git commit -m 'add new feature'
    
  4. Push to the branch

    git push origin feature/new_feature
    
  5. Create a new pull request (PR)


Siddesh Sambasivam Suseela - @ssiddesh45 - plutocrat45@gmail.com

Distributed under the MIT license. See LICENSE for more information.

