A symbolic JAX library

Project description

SymJAX logo

SymJAX: symbolic CPU/GPU/TPU programming

This is an under-development research project, not an official product; expect bugs and sharp edges. Please help by trying it out and reporting bugs. See the reference docs for details.

What is SymJAX?

SymJAX is a symbolic programming version of JAX that simplifies graph inputs/outputs/updates and provides additional functionality for general machine learning and deep learning applications. From a user's perspective, SymJAX is akin to Theano, with fast graph optimization/compilation and broad hardware support, along with Lasagne-like deep learning functionality.

Why SymJAX?

The number of libraries built on top of JAX/TensorFlow/PyTorch is large and growing by the day. What SymJAX offers, as opposed to most, is an all-in-one library with diverse functionalities such as

  • dozens of datasets with clear descriptions and one-line imports
  • a versatile set of functions: FFTs, linear algebra tools, random variables, ...
  • advanced signal processing tools such as multiple wavelet families (in the time and frequency domains), multiple time-frequency representations, apodization windows, ...
  • IO utilities to monitor/save/track specific statistics during graph execution through h5 files and numpy, plus simple and explicit graph saving that allows models to be saved and loaded without hassle
  • side utilities such as automatic dataset batching, data splitting, cross-validation, ...

and most importantly, a SYMBOLIC/DECLARATIVE programming environment allowing CONCISE/EXPLICIT/OPTIMIZED computations.

For a deep-network-oriented imperative library built on JAX with a JAX syntax, check out FLAX.

Examples

import symjax as sj
import symjax.tensor as T

# create our variable to be optimized
mu = T.Variable(T.random.normal((), seed=1))

# create our cost
cost = T.exp(-(mu - 1) ** 2)

# get the gradient, notice that it is itself a tensor that can then
# be manipulated as well
g = sj.gradients(cost, mu)
print(g)

# (Tensor: shape=(), dtype=float32)

# create the compiled function that will compute the cost and apply
# the update onto the variable
f = sj.function(outputs=cost, updates={mu: mu - 0.2 * g})

for i in range(10):
    print(f())

# 0.008471076
# 0.008201109
# 0.007946267
# ...
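
Placeholders make it possible to feed data into a compiled function at call time. The snippet below is a minimal sketch only, not verified against this release: it assumes symjax.tensor.Placeholder takes a shape and a dtype, and that sj.function accepts placeholders as positional inputs.

import symjax as sj
import symjax.tensor as T

# a scalar placeholder to be fed at call time (assumed signature)
x = T.Placeholder((), "float32")

# the same Gaussian-shaped cost as above, now a function of the input
y = T.exp(-(x - 1) ** 2)

# compile a function mapping the placeholder input to the output
f = sj.function(x, outputs=y)

print(f(1.0))
# expected: 1.0, since exp(0) = 1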

Installation

For GPU support (optional), make sure to install all the needed GPU drivers, and install JAX as described in its installation guide.
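
Since this page distributes symjax 0.5.0 on PyPI, the package itself can then be installed the usual way:

pip install symjax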

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

symjax-0.5.0.tar.gz (149.0 kB)

Uploaded Source

Built Distribution

symjax-0.5.0-py3-none-any.whl (172.9 kB)

Uploaded Python 3

File details

Details for the file symjax-0.5.0.tar.gz.

File metadata

  • Download URL: symjax-0.5.0.tar.gz
  • Upload date:
  • Size: 149.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.8.6

File hashes

Hashes for symjax-0.5.0.tar.gz

  • SHA256: 30444d4f3f4bcb439da756737e06efee85bf1dd70b05d23486f2a645855976c7
  • MD5: a885744a9a36e67eb7fcba5bf387fa38
  • BLAKE2b-256: 961c8cf20210088ec418a704df33d554ce8b6c56abe2406512d9fdc138234bc8

See more details on using hashes here.
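
As a concrete example, a downloaded file's SHA256 digest can be checked with nothing but the Python standard library; the expected value below is the SHA256 listed above for the source distribution, and the same recipe applies to the wheel.

import hashlib

# SHA256 digest published above for symjax-0.5.0.tar.gz
expected = "30444d4f3f4bcb439da756737e06efee85bf1dd70b05d23486f2a645855976c7"

# hash the downloaded archive and compare against the published digest
with open("symjax-0.5.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")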

File details

Details for the file symjax-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: symjax-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 172.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.8.6

File hashes

Hashes for symjax-0.5.0-py3-none-any.whl

  • SHA256: b34d82a2290ce4d614d0f6b58b4916751baec9a70166579e53046a1a90d3f763
  • MD5: 2e65a68209cfaffa4c5eb8b38f0edee4
  • BLAKE2b-256: d16104522a9915d442417195381744e0639190add9139f9cf2d624e7500b3991

See more details on using hashes here.
