A machine learning framework developed by Neil Jiang.

Project description

pinenut

A deep learning framework developed by Neil Jiang. It supports a define-by-run computational graph, computes gradients automatically, and offers GPU acceleration. It is powerful, flexible, and easy to understand.

Installation

pip install pinenut

Usage

from pinenut import Tensor
import numpy as np

# define a computational graph
x = Tensor(np.array([1, 2, 3]))
y = Tensor(np.array([4, 5, 6]))
z = x + y
z.backward() # calculate the gradients of z with respect to x and y
print(x.grad) # [1, 1, 1]
print(y.grad) # [1, 1, 1]
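
Because the graph is built as operations run, longer chains work the same way. The sketch below continues the snippet above and is an assumption-based example: it assumes elementwise multiplication (*) is overloaded on Tensor the same way + is.

a = Tensor(np.array([1.0, 2.0, 3.0]))
b = Tensor(np.array([4.0, 5.0, 6.0]))
c = a * b          # elementwise product (assumed overload)
d = c + a          # the graph grows as each operation executes
d.backward()       # dd/da = b + 1, dd/db = a
print(a.grad)      # expected: [5. 6. 7.]
print(b.grad)      # expected: [1. 2. 3.]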

GPU acceleration

from pinenut import Tensor, Cuda, matmul, as_array

x = Tensor([1, 2, 3])
y = Tensor([4, 5, 6])

cuda_is_available = Cuda.available()
if cuda_is_available:
    x.to_gpu()          # move tensor data to GPU memory
    y.to_gpu()
z = matmul(x, y.T)      # inner product: 1*4 + 2*5 + 3*6 = 32
assert z.data == as_array(32)
print(type(z.data))     # underlying array type (differs on CPU vs. GPU)
z.backward()
assert (x.grad.data == [4, 5, 6]).all()
assert (y.grad.data == [1, 2, 3]).all()

Examples

  • MNIST
import numpy as np
import pinenut.core.datasets as dss
from pinenut import MLP, SGD, relu, softmax
from pinenut import Cuda

def data_transform(x):
    x = x.flatten()              # 28x28 image -> 784-dim vector
    x = x.astype(np.float32)
    return x / 255.0             # scale pixel values to [0, 1]

train = dss.MNIST(train=True, data_transform=data_transform)
test = dss.MNIST(train=False, data_transform=data_transform)

epochs = 5
batch_size = 100
lr = 0.1 # learning rate

# 784 inputs, one hidden layer of 100 units, 10 output classes
model = MLP([784, 100, 10], hidden_activation=relu, output_activation=softmax)
optimizer = SGD(model, lr)
cuda_is_available = Cuda.available()
model.train(train, epochs, batch_size, optimizer, test, enable_cuda=cuda_is_available)
model.save_weights('mnist_weights.npz')
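
A hypothetical follow-up sketch for inference (not from the original docs): it assumes a load_weights() counterpart to save_weights(), that indexing the test dataset yields (image, label) pairs, and that calling the model on a batch returns class probabilities.

# hypothetical inference sketch, continuing from the training code above
model2 = MLP([784, 100, 10], hidden_activation=relu, output_activation=softmax)
model2.load_weights('mnist_weights.npz')   # assumed counterpart to save_weights()

x, label = test[0]                         # assumed: dataset yields (image, label)
probs = model2(x.reshape(1, -1))           # assumed: forward call on a batch of one
print('predicted:', np.argmax(probs.data), 'label:', label)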

Features

  • Define-by-run computational graph
  • GPU acceleration
  • Automatic gradient calculation
  • Support for various activation, loss and optimizer functions
  • Pure Python implementation

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pinenut-0.1.3.tar.gz (5.5 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

pinenut-0.1.3-py3-none-any.whl (5.5 MB)

Uploaded Python 3

File details

Details for the file pinenut-0.1.3.tar.gz.

File metadata

  • Download URL: pinenut-0.1.3.tar.gz
  • Upload date:
  • Size: 5.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for pinenut-0.1.3.tar.gz

  • SHA256: 5e6321d3745a1d3f73d2dde97e82d5db4062dc58d322a60d5183f92705c0d74b
  • MD5: 29d8d343d14a253911b8b72f1d7ae2b2
  • BLAKE2b-256: 308ddebf910dd87a0b0e3ec16e18929ea17301d81e555d7167e5e560db5e4779

See more details on using hashes here.
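
To verify a download against the SHA256 digest listed above, the Python standard library is enough; a minimal sketch:

import hashlib

# compute the SHA256 of the downloaded archive and compare with the listed digest
with open('pinenut-0.1.3.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = '5e6321d3745a1d3f73d2dde97e82d5db4062dc58d322a60d5183f92705c0d74b'
print(digest == expected)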

File details

Details for the file pinenut-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: pinenut-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 5.5 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for pinenut-0.1.3-py3-none-any.whl

  • SHA256: e7cf1d2f8bac161ce5d0721d05dec83d93d236aecfd588d75e3a0ad5a936617a
  • MD5: 9af1832164b98ee504c53c7524de66e3
  • BLAKE2b-256: c41b0e716360295e25a12291a4f35c58d81369c890cd67846e9c530d8fe6fb49

See more details on using hashes here.
