Automatic differentiation and generation of Torch/Tensorflow operations with pystencils (https://i10git.cs.fau.de/pycodegen/pystencils)
pystencils_autodiff
This repository adds automatic differentiation to pystencils.
Installation
Install via pip:
pip install pystencils-autodiff
or, if you have cloned this repository, from its root directory:
pip install -e .
Then, you can access the submodule pystencils.autodiff.
import pystencils.autodiff
Usage
Create a pystencils.AssignmentCollection with pystencils:
import sympy
import pystencils
z, y, x = pystencils.fields("z, y, x: [20,30]")
forward_assignments = pystencils.AssignmentCollection({
    z[0, 0]: x[0, 0] * sympy.log(x[0, 0] * y[0, 0])
})
print(forward_assignments)
Subexpressions:
Main Assignments:
z[0,0] ← x_C*log(x_C*y_C)
You can then obtain the corresponding backward assignments:
from pystencils.autodiff import AutoDiffOp, create_backward_assignments
backward_assignments = create_backward_assignments(forward_assignments)
print(backward_assignments)
You can see the derivatives with respect to the two inputs, each multiplied by the gradient diffz_C of the output z_C.
Subexpressions:
Main Assignments:
\hat{x}[0,0] ← diffz_C*(log(x_C*y_C) + 1)
\hat{y}[0,0] ← diffz_C*x_C/y_C
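The two backward expressions follow from the chain rule, and can be verified with plain sympy differentiation (no pystencils needed; the scalar symbols below stand in for the field center values x_C, y_C, diffz_C):

```python
import sympy

x, y, diffz = sympy.symbols('x y diffz', positive=True)
z = x * sympy.log(x * y)

# The backward pass multiplies each partial derivative of z
# by the incoming gradient diffz of the output (chain rule).
grad_x = diffz * sympy.diff(z, x)
grad_y = diffz * sympy.diff(z, y)

# grad_x matches diffz*(log(x*y) + 1), grad_y matches diffz*x/y
assert sympy.simplify(grad_x - diffz * (sympy.log(x * y) + 1)) == 0
assert sympy.simplify(grad_y - diffz * x / y) == 0
```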
You can also use the class AutoDiffOp to obtain both the assignments (if you are curious) and auto-differentiable operations for TensorFlow…
op = AutoDiffOp(forward_assignments)
backward_assignments = op.backward_assignments
tensorflow_op = op.create_tensorflow_op(backend='tensorflow_native', use_cuda=True)
… or Torch:
torch_op = op.create_tensorflow_op(backend='torch_native', use_cuda=True)
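Whichever backend executes the kernels, the backward expressions can also be sanity-checked numerically. The following NumPy sketch uses hand-written stand-ins for the generated forward/backward kernels (they are not part of the library) and compares the analytic gradients against central finite differences:

```python
import numpy as np

# Hand-written stand-in for the generated forward kernel:
# the same scalar function, applied elementwise.
def forward(x, y):
    return x * np.log(x * y)

# Analytic gradients taken from the backward assignments above.
def backward(x, y, diffz):
    return diffz * (np.log(x * y) + 1), diffz * x / y

rng = np.random.default_rng(0)
x = rng.uniform(0.5, 2.0, (20, 30))   # keep arguments positive for log
y = rng.uniform(0.5, 2.0, (20, 30))
diffz = np.ones_like(x)               # incoming gradient of z
gx, gy = backward(x, y, diffz)

# Central finite differences as an independent check.
eps = 1e-6
fd_x = (forward(x + eps, y) - forward(x - eps, y)) / (2 * eps)
fd_y = (forward(x, y + eps) - forward(x, y - eps)) / (2 * eps)
assert np.allclose(gx, fd_x, atol=1e-5)
assert np.allclose(gy, fd_y, atol=1e-5)
```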
Source Distribution

Hashes for pystencils_autodiff-0.3.3.tar.gz:

Algorithm | Hash digest
---|---
SHA256 | 7ab2dc27bd3a173367fcbc6b9a91c5fa85193119706160d1f731d6ae11ab22e4
MD5 | 5f34e624abc0a03686024f52e2f1eb90
BLAKE2b-256 | 6ff64e042cb4e28484c531be6e422136f3541c759a7a666f4c7a41120673d880