# Slipform
🏗 **slipform**: a pythonflow decorator for generating dataflow graphs from raw Python.
## Why?
- Syntax stays natural: a single decorator gives you the dataflow graph, with no need to rewrite your code for pythonflow.
- Slipform lets you write and test code as you normally would, debugging it at runtime with the debugger of your choice. Only when you are happy with the code do you generate the dataflow graph from it.
## Disclaimer
Slipform was born out of a desire to learn more about the python AST, and potentially use it for my own personal projects if it works out.
It is not actively developed, nor should it be considered stable.
## Roadmap
### Priority
- naming from assignments
- placeholders from args
- constant support
- functions to operations
- import support
- custom operation support
- ignore comments: `#[ignore]` or `#[slipform:ignore]`
### Investigate
- module / import detection from function scope
- sequences (map, list, tuple, zip, sum, filter)
- for loop replacement?
- conditional expression replacement?
- assertion replacement?
- try/catch replacement?
- explicit dependencies?
## Examples

The examples below are based on the *Using Pythonflow* guide from the pythonflow documentation.
- Get started by importing slipform:

```python
from slipform import slipform
# "pf" must be part of the scope of any @slipform-annotated
# function. This limitation will be relaxed in the future.
import pythonflow as pf
```
- A simple example is as follows:
```python
@slipform
def add_graph(x):
    a = 5
    b = 32
    z = a + b + x
```
With the equivalent pythonflow version:
```python
with pf.Graph() as add_graph:
    x = pf.placeholder('x')
    a = pf.constant(5, name='a')
    b = pf.constant(32, name='b')
    z = (a + b + x).set_name('z')
```
We can evaluate the graph as usual with pythonflow:
```python
>>> add_graph(['b', 'z'], x=5)
(32, 42)
```
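A single output can also be fetched by name, which is standard pythonflow graph-call behaviour:

```python
>>> add_graph('z', x=5)
42
```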
- A more complicated example:

```python
@slipform()
def vae(x, x_target, encoder, decoder, mse):
    # import ... / from ... import ... (as ...) are all supported
    import torch
    import torch.nn.functional as F
    # get the encoding!
    z_params = encoder(x)
    # deterministic, we don't reparameterize here
    z = z_params.z_mean
    # reconstruct here
    x_pre_recon = decoder(z)
    # final activation
    x_recon = x_pre_recon if mse else torch.sigmoid(x_pre_recon)
    # compute loss
    loss = F.mse_loss(x_recon, x_target) if mse else F.binary_cross_entropy_with_logits(x_pre_recon, x_target)
```
The above will generate code equivalent in functionality to:
```python
with pf.Graph() as vae:
    x = pf.placeholder('x')
    x_target = pf.placeholder('x_target')
    encoder = pf.placeholder('encoder')
    decoder = pf.placeholder('decoder')
    mse = pf.placeholder('mse')
    # import everything
    torch = pf.import_('torch')
    F = pf.import_('torch.nn.functional')
    # get the encoding!
    z_params = encoder(x)
    z_params.set_name('z_params')
    # deterministic, we don't reparameterize here
    z = z_params.z_mean
    z.set_name('z')
    # reconstruct here
    x_pre_recon = decoder(z)
    x_pre_recon.set_name('x_pre_recon')
    # final activation
    x_recon = pf.conditional(mse, x_pre_recon, torch.sigmoid(x_pre_recon))
    x_recon.set_name('x_recon')
    # compute loss
    loss = pf.conditional(mse, F.mse_loss(x_recon, x_target), F.binary_cross_entropy_with_logits(x_pre_recon, x_target))
    loss.set_name('loss')
```
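As with the simpler graph, outputs are fetched by name. The following is only a rough usage sketch; the `ZParams` namedtuple and the pass-through encoder/decoder stand-ins are hypothetical placeholders invented for illustration, not part of slipform or pythonflow:

```python
from collections import namedtuple
import torch

# Hypothetical stand-ins so the graph's placeholders have something callable to bind to;
# a real model would supply proper encoder/decoder modules instead.
ZParams = namedtuple('ZParams', ['z_mean'])

def encoder(t):
    return ZParams(z_mean=t)  # "encode" by passing the input straight through

def decoder(t):
    return t  # "decode" by returning the latent unchanged

x = torch.rand(8, 16)
z, loss = vae(['z', 'loss'], x=x, x_target=x,
              encoder=encoder, decoder=decoder, mse=True)
print(loss)  # mse_loss(x, x) == 0, since the "reconstruction" is the input itself
```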
## File details

### slipform-0.0.1a2.tar.gz (source distribution)

File metadata:

- Download URL: slipform-0.0.1a2.tar.gz
- Upload date:
- Size: 13.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: python-requests/2.22.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 09ae4342e6350cbfeb281ca5a3af903b0a1a4b36bba171cf12daa7f17bed2843 |
| MD5 | d9c8e4c37bc634fecc0807e202087497 |
| BLAKE2b-256 | 2d4fda4afdec78d378f06aa9870601968fd176cfc1cf47046ea8711435d87650 |
### slipform-0.0.1a2-py2.py3-none-any.whl (built distribution)

File metadata:

- Download URL: slipform-0.0.1a2-py2.py3-none-any.whl
- Upload date:
- Size: 7.9 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: python-requests/2.22.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | b11a2562c71215a65062691c8bdecaf7f13b2ed67d8ee81d6ef5ab205ccfd178 |
| MD5 | 8be21a06ddd54fcf6087ed673ee0c730 |
| BLAKE2b-256 | 9bd2041466b3af5723c593002883cf7e3874d5c6c6d937832624ac8411cf1dd3 |