Project description
NumGrad
Simple gradient computation library for Python.
Getting Started
pip install numgrad
Inspired by TensorFlow, NumGrad supports automatic differentiation in TensorFlow v2 style while using the original NumPy and SciPy functions.
>>> import numgrad as ng
>>> import numpy as np  # Original numpy
>>>
>>> # Pure numpy function
>>> def tanh(x):
...     y = np.exp(-2 * x)
...     return (1 - y) / (1 + y)
...
>>> x = ng.Variable(1)
>>> with ng.Graph() as g:
...     # numgrad patches numpy functions automatically here
...     y = tanh(x)
...
>>> g.backward(y, [x])
(0.419974341614026,)
>>> (tanh(1.0001) - tanh(0.9999)) / 0.0002
0.41997434264973155
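The finite-difference check at the end of the example generalizes. As a sketch (pure NumPy, independent of numgrad; `central_diff` is a hypothetical helper, not part of the library), a symmetric difference quotient can validate any scalar gradient:

```python
import numpy as np

def central_diff(f, x, eps=1e-4):
    """Approximate f'(x) with a symmetric finite difference."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def tanh(x):
    y = np.exp(-2 * x)
    return (1 - y) / (1 + y)

# Should agree with the analytic gradient 1 - tanh(1)**2 to ~1e-8
print(central_diff(tanh, 1.0))
```

The symmetric form is preferred over a one-sided difference because its error shrinks quadratically in `eps` rather than linearly.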
NumGrad also supports JAX-style automatic differentiation via `ng.grad`, which can be applied repeatedly to take higher-order derivatives.
>>> import numgrad as ng
>>> import numpy as np  # Original numpy unlike `jax`
>>>
>>> power_derivatives = [lambda a: np.power(a, 5)]
>>> for _ in range(6):
...     power_derivatives.append(ng.grad(power_derivatives[-1]))
...
>>> [f(2) for f in power_derivatives]
[32, 80.0, 160.0, 240.0, 240.0, 120.0, 0.0]
>>> [f(-1) for f in power_derivatives]
[-1, 5.0, -20.0, 60.0, -120.0, 120.0, -0.0]
Contribute
Be sure to run the following commands before developing:
$ git clone https://github.com/ctgk/numgrad.git
$ cd numgrad
$ pre-commit install
Project details
Download files
Download the file for your platform.
Source Distribution
numgrad-0.2.0.tar.gz (16.6 kB)
File details
Details for the file numgrad-0.2.0.tar.gz.
File metadata
- Download URL: numgrad-0.2.0.tar.gz
- Upload date:
- Size: 16.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.9.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5de66e95e8934bf466c0a5845bd10fc8db74f3385d8f7b4bcd753dc81a933847 |
| MD5 | 50ce59328d65c52d6628364314f1f27d |
| BLAKE2b-256 | 7114ac2e0cd1e21d9ba6fd53f82e2535ba064bcca2d202f8a0f81c9ffe13f9a6 |
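To verify a downloaded archive against the digests listed above, Python's standard `hashlib` covers all three algorithms (BLAKE2b-256 is BLAKE2b with a 32-byte digest). A sketch, assuming the file is saved under the name shown in the table:

```python
import hashlib

def file_hashes(path):
    """Return (SHA256, MD5, BLAKE2b-256) hex digests of a file."""
    sha256 = hashlib.sha256()
    md5 = hashlib.md5()
    blake2b = hashlib.blake2b(digest_size=32)  # BLAKE2b-256
    with open(path, "rb") as fh:
        # Stream in chunks so large archives are not loaded whole
        for chunk in iter(lambda: fh.read(8192), b""):
            for h in (sha256, md5, blake2b):
                h.update(chunk)
    return sha256.hexdigest(), md5.hexdigest(), blake2b.hexdigest()

# Compare against the table, e.g.:
# sha, _, _ = file_hashes("numgrad-0.2.0.tar.gz")
# assert sha == "5de66e95e8934bf466c0a5845bd10fc8db74f3385d8f7b4bcd753dc81a933847"
```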