Piecewise Polynomial and Fourier Series Layers for TensorFlow

TensorFlow layers using piecewise Lagrange polynomials with Gauss-Lobatto nodes (truncated Fourier series and other orthogonal functions are also being added). This is a technique commonly used in finite element analysis, and it means that the weight assigned to each node is exactly the function value at that node. Long ago I wrote a C++ code that explored higher order polynomials in the synapse of a standard neural network here. Here I'm implementing some of that capability in TensorFlow.
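
To see why the weight at each node is exactly the function value there, note that the Lagrange basis satisfies l_i(x_j) = 1 when i = j and 0 otherwise. The sketch below (not the library's internals) checks this numerically; Chebyshev-Gauss-Lobatto points stand in for the Gauss-Lobatto nodes because they have a simple closed form.

import numpy as np

def lobatto_nodes(order):
    # Chebyshev-Gauss-Lobatto points on [-1, 1], endpoints included.
    return np.cos(np.pi * np.arange(order + 1) / order)[::-1]

def lagrange_basis(nodes, x):
    # Evaluate every Lagrange basis polynomial l_i at the scalar x.
    vals = []
    for i, xi in enumerate(nodes):
        others = np.delete(nodes, i)
        vals.append(np.prod((x - others) / (xi - others)))
    return np.array(vals)

nodes = lobatto_nodes(5)
weights = np.sin(np.pi * nodes)               # stand-in for learned weights
y = weights @ lagrange_basis(nodes, nodes[2])  # evaluate at a node...
assert np.isclose(y, weights[2])               # ...returns the nodal weight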

Idea

The idea is extremely simple: instead of a single weight at the synapse, use n weights. The n weights describe a piecewise polynomial, and each of them can be updated independently. A Lagrange polynomial with Gauss-Lobatto points is used to minimize oscillations of the polynomial. The same approach can be applied to any "functional" synapse, and this repo includes Fourier series synapses as well. This can be implemented as the construction of a polynomial or Fourier kernel followed by a standard TensorFlow layer with a linear activation.
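
As a rough sketch of that last sentence, here is the kernel-then-linear-layer pattern, with an illustrative monomial basis standing in for the library's Lagrange basis; the function name and basis below are assumptions for illustration, not the package's internals.

import tensorflow as tf

def poly_features(x, order=3):
    # Expand each scalar input into basis features. A monomial basis is
    # used here for brevity; the library builds a Lagrange basis on
    # Gauss-Lobatto nodes instead.
    return tf.concat([x ** k for k in range(1, order + 1)], axis=-1)

x = tf.random.uniform((8, 4), minval=-1.0, maxval=1.0)  # a batch of inputs
features = poly_features(x, order=3)                    # shape (8, 12)
out = tf.keras.layers.Dense(20)(features)               # linear activation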

Why

Using higher order polynomial representations might allow networks with far fewer total weights. In physics, higher order methods can be much more efficient; spectral and discontinuous Galerkin methods are examples of this. Note that a standard neural network with ReLU activations is piecewise linear. Here there are no bias weights and the "non-linearity" lives in the synapse.

In addition, it's well known that dendrites are themselves computational units in neurons (see, for example, Dendritic action potentials and computation in human layer 2/3 cortical neurons), and this is a simple way to add more computational power to the artificial neural network model.

Installation

pip install high-order-layers

Use

import tensorflow as tf
import high_order_layers.PolynomialLayers as poly
from tensorflow.keras.layers import *
mnist = tf.keras.datasets.mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()
# Scale pixel values from [0, 255] to roughly [-1, 1].
x_train, x_test = (x_train / 128.0 - 1.0), (x_test / 128.0 - 1.0)

units = 20

basis = poly.b3  # 3rd order polynomial basis

model = tf.keras.models.Sequential([
  Flatten(input_shape=(28, 28)),
  poly.Polynomial(units, basis=basis, shift=0.0),
  LayerNormalization(),
  poly.Polynomial(units, basis=basis, shift=0.0),
  LayerNormalization(),
  poly.Polynomial(units, basis=basis, shift=0.0),
  LayerNormalization(),
  poly.Polynomial(units, basis=basis, shift=0.0),
  LayerNormalization(),
  Dense(10, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=20, batch_size=10)
model.evaluate(x_test, y_test)

Examples

Run the examples from the main directory. For the functionExample case, run

python functionExample.py

The examples include:
  1. invariant mnist resnet
  2. convolutional neural network mnist
  3. fitting a sin wave
  4. cifar10 convolutional neural network
  5. invariant cifar10 resnet
  6. reinforcement learning inverted pendulum with a Fourier series

Fitting a function

The examples below are deliberately simple: just fit a shifted sin wave. Using the Lagrange polynomial layers here, a single input and output unit with no hidden layers is sufficient to fit the sin wave (as demonstrated below). I hope this illustrates exactly what is going on and why one might want to use a technique like this. A standard ReLU network with 1 and 2 hidden layers is included for comparison.

Example - Simple Polynomial

The solution is shown for a linear, cubic, and 5th order polynomial used in the synapse; there are 6 weights in the 5th order polynomial and 2 units total (1 input and 1 output).
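
Below is a minimal sketch of that single-unit fit, assuming only the Polynomial layer signature from the mnist example above; the data, shift amount, and training settings are illustrative.

import numpy as np
import tensorflow as tf
import high_order_layers.PolynomialLayers as poly

x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(np.pi * (x + 0.1))  # a shifted sin wave on [-1, 1]

# One input, one output, no hidden layers: the 5th order basis puts
# 6 weights on this single synapse.
model = tf.keras.models.Sequential([
    tf.keras.layers.InputLayer(input_shape=(1,)),
    poly.Polynomial(1, basis=poly.b5, shift=0.0),
])
model.compile(optimizer='adam', loss='mse')
model.fit(x, y, epochs=200, verbose=0)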

Example 2 - Piecewise Discontinuous Polynomial (2 pieces)

Same problem, but a comparison between 1st, 2nd, and 5th order piecewise discontinuous polynomial synapses. This could be useful in problems that have discontinuities, as many problems in physics do.

Example 3 - Piecewise Continuous Polynomial (2 pieces)

Same problem, but a comparison between 1st, 2nd, and 5th order piecewise continuous polynomial synapses.

Example 4 - Fourier series layer up to 5 frequencies

Same problem, but a comparison between Fourier series with 1, 2, and 5 frequencies.

Comparison with ReLU layer

A ReLU network is included for comparison, first with 1 hidden layer and then with 2 hidden layers of the given number of units. Adding a second layer gives the result we expect, but at the cost of a massive increase in the total number of weights. Since we are using dense layers, at 5 units per layer we have a total of 35 weights, and at 10 units per layer we have 120 weights plus bias weights. The 5th order polynomial pair has a total of 12 weights in the discontinuous case and 11 in the continuous case. So by moving to high order polynomials, the number of weights required may decrease by as much as an order of magnitude. More research is necessary, but this is in line with results from other fields.
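
For reference, those totals are consistent with a 1-input, 1-output dense network with two hidden layers of width h, biases excluded:

def dense_weights(h):
    # Weights in a 1-in, 1-out dense network with two hidden layers
    # of width h, ignoring bias weights.
    return 1 * h + h * h + h * 1

print(dense_weights(5))   # 35 weights at 5 units per layer
print(dense_weights(10))  # 120 weights at 10 units per layer
print(2 * (5 + 1))        # 12: a pair of 5th order synapses (discontinuous)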

Available polynomial orders

import high_order_layers.PolynomialLayers as poly

# Non-piecewise polynomials
poly.b1 # linear
poly.b2 # quadratic
poly.b3 # 3rd order
poly.b4 # 4th order
poly.b5 # 5th order

# Discontinuous piecewise polynomials, 2 pieces
poly.b1D # linear (discontinuous pair)
poly.b2D # quadratic (discontinuous pair)
poly.b3D # cubic (discontinuous pair)
poly.b4D # quartic (discontinuous pair)
poly.b5D # 5th order (discontinuous pair)

# Continuous piecewise polynomials, 2 pieces
poly.b1C # linear (continuous pair)
poly.b2C # quadratic (continuous pair)
poly.b3C # cubic (continuous pair)
poly.b4C # quartic (continuous pair)
poly.b5C # 5th order (continuous pair)

The layer is then invoked inside TensorFlow as (see the mnist example above)

poly.Polynomial(units, input, basis=basis),

where units is the number of units, input is the size of the input, and basis is the polynomial basis, e.g. poly.b3.

Fourier Series Layer

In addition, there is a Fourier series layer

import high_order_layers.FourierLayers as fourier
...
layer = fourier.Fourier(units, frequencies=10, length=2.0, shift=0.0)

where 'units' is the number of units, 'frequencies' is the number of frequencies to include, and 'length' is the wavelength of the longest wave.
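
As with the polynomial layer, here is a minimal sketch fitting a shifted sin wave with a single Fourier unit, using only the signature shown above; the data and hyperparameters are illustrative.

import numpy as np
import tensorflow as tf
import high_order_layers.FourierLayers as fourier

x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(np.pi * (x + 0.1))  # a shifted sin wave on [-1, 1]

# length=2.0 makes the longest wavelength span the [-1, 1] domain.
model = tf.keras.models.Sequential([
    tf.keras.layers.InputLayer(input_shape=(1,)),
    fourier.Fourier(1, frequencies=5, length=2.0, shift=0.0),
])
model.compile(optimizer='adam', loss='mse')
model.fit(x, y, epochs=200, verbose=0)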
