
`beaverpy` is an implementation of PyTorch operators using only NumPy

Project description

beaverpy :beaver:

Description

v0.1.0

beaverpy is an implementation of PyTorch operators using only NumPy.
Implemented operators (and their PyTorch equivalents) include the following:

  • Conv2D (torch.nn.Conv2d)
  • MaxPool2D (torch.nn.MaxPool2d)
  • Linear (torch.nn.Linear)
  • MSELoss (torch.nn.MSELoss)
  • CosineSimilarity (torch.nn.CosineSimilarity)
  • ReLU (torch.nn.ReLU)
  • Sigmoid (torch.nn.Sigmoid)
  • Softmax (torch.nn.Softmax)

Note 1: [n, c, h, w] format is used

Note 2: Test code that checks for correctness of the implementation is included in respective notebooks and is also available as standalone pytest scripts
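
For instance, a standalone correctness test could look like the following sketch (a minimal example written for this README, assuming PyTorch is installed for comparison; the tests shipped with the package may differ):

import numpy as np
import torch
import beaverpy as bp

def test_relu_matches_torch():
    _input = np.random.rand(2, 3, 4).astype(np.float32) # random input
    _output = bp.ReLU().forward(_input) # beaverpy result
    _expected = torch.nn.ReLU()(torch.from_numpy(_input)).numpy() # PyTorch reference
    assert np.allclose(_output, _expected)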

Optional parameters supported

  • Conv2D: stride, padding, dilation, groups
  • MaxPool2D: stride, padding, dilation, return_indices
  • Linear: bias
  • MSELoss: reduction
  • CosineSimilarity: dim, eps
  • Softmax: dim

How to install?

pip3 install beaverpy

How to use?

Import beaverpy and numpy

import beaverpy as bp
import numpy as np

Following is an example of how to use Conv2D:

Define input parameters
in_channels = 6 # input channels
out_channels = 4 # output channels
kernel_size = (2, 2) # kernel size

_stride = (2, 1) # stride (optional)
_padding = (1, 3) # padding (optional)
_dilation = (2, 3) # dilation factor (optional)
_groups = 2 # groups (optional)

in_batches = 2 # input batches
in_h = 4 # input height
in_w = 4 # input width
Create a random input using the input parameters
_input = np.random.rand(in_batches, in_channels, in_h, in_w)
Call an instance of Conv2D with the input parameters
conv2d = bp.Conv2D(in_channels, out_channels, kernel_size, stride = _stride, padding = _padding, dilation = _dilation, groups = _groups)
Perform convolution
_output = conv2d.forward(_input) # perform convolution
If you wish to provide your own kernels, define them and pass them as an argument to forward():
kernels = []
for k in range(out_channels):
    kernel = np.random.rand(int(in_channels / _groups), kernel_size[0], kernel_size[1]) # define a random kernel based on the kernel parameters
    kernels.append(kernel)
_output = conv2d.forward(_input, kernels) # perform convolution
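
As a sanity check, the output shape can be predicted from the usual convolution output-size formula (a sketch, assuming beaverpy follows the same [n, c, h, w] convention and output-size formula as torch.nn.Conv2d):
# H_out = floor((H_in + 2 * padding - dilation * (kernel - 1) - 1) / stride) + 1
out_h = (in_h + 2 * _padding[0] - _dilation[0] * (kernel_size[0] - 1) - 1) // _stride[0] + 1
out_w = (in_w + 2 * _padding[1] - _dilation[1] * (kernel_size[1] - 1) - 1) // _stride[1] + 1
print(_output.shape) # expected: (in_batches, out_channels, out_h, out_w) = (2, 4, 2, 7)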

Following is an example of how to use MaxPool2D:

Define input parameters
in_channels = 3 # input channels
kernel_size = (6, 6) # kernel size

_stride = (1, 5) # stride (optional)
_padding = (1, 2) # padding (optional)
_dilation = (2, 1) # dilation factor (optional)
_return_indices = True # return max indices (optional)

in_batches = 3 # input batches
in_h = 11 # input height
in_w = 8 # input width
Create a random input using the input parameters
_input = np.random.rand(in_batches, in_channels, in_h, in_w)
Call an instance of MaxPool2D with the input parameters
maxpool2d = bp.MaxPool2D(kernel_size, stride = _stride, padding = _padding, dilation = _dilation, return_indices = _return_indices)
Perform maxpooling
_output = maxpool2d.forward(_input)
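
The expected spatial output size follows the same formula as for convolution (a sketch, assuming beaverpy mirrors torch.nn.MaxPool2d; with return_indices = True, the indices of the maxima are expected to be returned as well):
out_h = (in_h + 2 * _padding[0] - _dilation[0] * (kernel_size[0] - 1) - 1) // _stride[0] + 1
out_w = (in_w + 2 * _padding[1] - _dilation[1] * (kernel_size[1] - 1) - 1) // _stride[1] + 1
print((out_h, out_w)) # expected spatial size for the parameters above: (3, 2)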

Following is an example of how to use Linear:

Define input parameters
in_samples = 128 # input samples
in_features = 20 # input features
out_features = 30 # output features
Create a random input using the input parameters
_input = np.random.rand(in_samples, in_features)
Call an instance of Linear with the input parameters
linear = bp.Linear(in_features, out_features)
Apply a linear transformation
_output = linear.forward(_input)
If you wish to provide your own weights and bias, define them and pass them as arguments to forward():
_weights = np.random.rand(out_features, in_features) # define random weights
_bias = np.random.rand(out_features) # define random bias
_output = linear.forward(_input, weights = _weights, bias_weights = _bias) # apply linear transformation
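
Since custom weights and bias were supplied above, the result can be cross-checked directly in NumPy (a sketch, assuming beaverpy follows the torch.nn.Linear convention y = x Wᵀ + b):
_expected = _input @ _weights.T + _bias # apply the linear transformation manually
print(np.allclose(_output, _expected)) # expected: True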

Following is an example of how to use MSELoss:

Create a random input and target
dimension = np.random.randint(1, 500) # dimension of the input and target (at least 1)
_input = np.random.rand(dimension) # define a random input of the above dimension
_target = np.random.rand(dimension) # define a random target of the above dimension
Call an instance of MSELoss
mseloss = bp.MSELoss()
Compute MSE loss
_output = mseloss.forward(_input, _target)
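
The result can be cross-checked against the definition of mean squared error (a sketch, assuming the default reduction is 'mean', as in torch.nn.MSELoss):
_expected = np.mean((_input - _target) ** 2) # mean of squared differences
print(np.isclose(_output, _expected)) # expected: True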

Following is an example of how to use CosineSimilarity:

Create random input
num_dim = np.random.randint(6) + 1 # number of input dimensions
shape = tuple(np.random.randint(5) + 1 for _ in range(num_dim)) # shape of input
_input1 = np.random.rand(*shape) # generate an input based on the dimensions and shape
_input2 = np.random.rand(*shape) # generate another input based on the dimensions and shape
_dim = np.random.randint(num_dim) # dimension along which CosineSimilarity is to be computed (optional)
_eps = np.random.uniform(low = 1e-10, high = 1e-6) # (optional)
        
Call an instance of CosineSimilarity with the input parameters
cosinesimilarity = bp.CosineSimilarity(dim = _dim, eps = _eps)
Compute CosineSimilarity
_output = cosinesimilarity.forward(_input1, _input2)
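
The result can be cross-checked against the cosine-similarity formula along the chosen dimension (a sketch, assuming the denominator is clamped by eps as in torch.nn.CosineSimilarity):
_numerator = np.sum(_input1 * _input2, axis = _dim)
_denominator = np.maximum(np.linalg.norm(_input1, axis = _dim) * np.linalg.norm(_input2, axis = _dim), _eps)
print(np.allclose(_output, _numerator / _denominator)) # expected: True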

Following is an example of how to use ReLU:

Create a random input
_input = np.random.rand(10, 20, 3)
Call an instance of ReLU
relu = bp.ReLU()
Apply ReLU activation
_output = relu.forward(_input)
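
Since ReLU is simply an element-wise maximum with zero, the result is easy to cross-check in NumPy:
print(np.allclose(_output, np.maximum(_input, 0))) # expected: True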

Following is an example of how to use Sigmoid:

Create a random input
_input = np.random.rand(10, 20, 3)
Call an instance of Sigmoid
sigmoid = bp.Sigmoid()
Apply Sigmoid activation
_output = sigmoid.forward(_input)
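
Likewise, the result can be cross-checked against the sigmoid definition 1 / (1 + exp(-x)):
print(np.allclose(_output, 1 / (1 + np.exp(-_input)))) # expected: True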

Following is an example of how to use Softmax:

Create a random input and dimension
_input = np.random.rand(1, 2, 1, 3, 4)
_dim = np.random.randint(_input.ndim) # dimension along which Softmax is applied (optional)
Call an instance of Softmax with the input parameters
softmax = bp.Softmax(dim = _dim)
Apply Softmax activation
_output = softmax.forward(_input)
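
The result can be cross-checked against the softmax definition along the chosen dimension (a sketch; the max-subtraction is only for numerical stability and does not change the result):
_shifted = _input - np.max(_input, axis = _dim, keepdims = True) # shift for numerical stability
_expected = np.exp(_shifted) / np.sum(np.exp(_shifted), axis = _dim, keepdims = True)
print(np.allclose(_output, _expected)) # expected: True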

Future work

  • Replace torch.round() with np.allclose() for tests
  • Implement other operators
  • Optimize code
  • For newer builds of this package that are under development and not yet available on PyPI, please visit the GitHub repository

Acknowledgements

This work is being done during my summer internship at DeGirum Corp., Santa Clara.

Using this code in your projects

  • If you are using this code in your projects, please make sure to cite this repository and the author
  • If you find bugs, create a pull request with a description of the bug and the proposed changes
  • Do have a look at the author's webpage for other interesting works!

README last updated on 06/08/2023

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

beaverpy-0.1.0.tar.gz (14.1 kB)

Uploaded Source

Built Distribution

beaverpy-0.1.0-py3-none-any.whl (15.8 kB)

Uploaded Python 3

File details

Details for the file beaverpy-0.1.0.tar.gz.

File metadata

  • Download URL: beaverpy-0.1.0.tar.gz
  • Upload date:
  • Size: 14.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for beaverpy-0.1.0.tar.gz:

  • SHA256: 51e3f852ccb51329af6e445685ee6fc7f1c89fa0d8108be018dd4f92a19122d7
  • MD5: b912e3fd3b3828cc178aea50a279b0f1
  • BLAKE2b-256: d400e65b4793031538f94f0cb6478a9c87ce08124f22f5fddf6fcbd231f44aa4


File details

Details for the file beaverpy-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: beaverpy-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 15.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for beaverpy-0.1.0-py3-none-any.whl:

  • SHA256: 9762b3116dfaec06f09870e2d909c03888ad813693af7799206782237ed5b767
  • MD5: 2e8a6fceda8a9ba7182857272016634b
  • BLAKE2b-256: 8baa888b12c3d4206deda396b472626d63d97316ed9efc0d4390d26fbd935782

