Dynamic Non-Uniform Piecewise Linear Layers

Example adding nodes: function approximation training progress (animation)

Example moving nodes (square wave): dynamic square wave (animation)

Example moving nodes (circle): dynamic circle (animation)

Layer weights for the circle: dynamic circle weights (animation)

A PyTorch implementation of non-uniform piecewise linear layers. These layers can learn arbitrary continuous piecewise linear functions, where both the positions (x-coordinates) and values (y-coordinates) of the control points are learned parameters.
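To make the idea concrete, here is a minimal, hedged sketch of such a layer. The class name TinyNonUniformPWL and its constructor arguments are invented for illustration only and are not this package's API; the actual layers in this project differ.

import torch
import torch.nn as nn


class TinyNonUniformPWL(nn.Module):
    """Toy 1-D piecewise linear function with learnable control points."""

    def __init__(self, num_points: int = 8, x_min: float = -1.0, x_max: float = 1.0):
        super().__init__()
        # Both the x-positions and the y-values of the control points are parameters.
        self.x = nn.Parameter(torch.linspace(x_min, x_max, num_points))
        self.y = nn.Parameter(torch.zeros(num_points))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # Sort the control points so interpolation stays well defined even if
        # training moves the x-positions out of order.
        x, order = torch.sort(self.x)
        y = self.y[order]
        t = t.clamp(x[0], x[-1])                      # flat extrapolation outside the range
        idx = torch.searchsorted(x, t).clamp(1, len(x) - 1)
        x0, x1 = x[idx - 1], x[idx]
        y0, y1 = y[idx - 1], y[idx]
        w = (t - x0) / (x1 - x0 + 1e-12)              # relative position within the segment
        return y0 + w * (y1 - y0)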

I typically run these in either dynamic mode (nodes are added) or adaptive mode (nodes are moved while the total number of nodes is conserved).
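As a hedged illustration of the dynamic (node-adding) mode, a new control point can be inserted without changing the represented function by splitting an existing segment. The splitting criterion used here (widest segment) and the function name are assumptions for illustration, not this package's refinement rule.

import torch

@torch.no_grad()
def split_widest_segment(x: torch.Tensor, y: torch.Tensor):
    # Sort, find the widest segment, and insert its midpoint as a new control
    # point whose y-value lies on the current interpolant, so the represented
    # function is unchanged by the refinement step.
    x_sorted, order = torch.sort(x)
    y_sorted = y[order]
    seg = torch.argmax(x_sorted[1:] - x_sorted[:-1])
    new_x = 0.5 * (x_sorted[seg] + x_sorted[seg + 1])
    new_y = 0.5 * (y_sorted[seg] + y_sorted[seg + 1])
    # Append; the forward pass above re-sorts, so order does not matter here.
    # In practice the result would be wrapped back into nn.Parameters.
    return torch.cat([x_sorted, new_x[None]]), torch.cat([y_sorted, new_y[None]])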

This is a work in progress.

Example moving nodes: implicit representation, using 40 neurons in a single hidden layer (animation)

Function Approximation Example

See examples/sine_fitting.py for a complete example of approximating a complex function using the non-uniform piecewise linear layer.
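For orientation, here is a hedged, stripped-down version of what such a fit looks like, reusing the toy TinyNonUniformPWL class sketched above; the real script uses this package's layer and its configuration, and the target function and hyperparameters here are made up.

import torch

torch.manual_seed(0)
layer = TinyNonUniformPWL(num_points=16)            # toy class sketched above
optimizer = torch.optim.Adam(layer.parameters(), lr=1e-2)

t = torch.linspace(-1.0, 1.0, 256)
target = torch.sin(3.0 * torch.pi * t)              # the function to approximate

for step in range(2000):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(layer(t), target)
    loss.backward()
    optimizer.step()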

Square Wave

A non-default example:

python examples/dynamic_square_wave.py training.adapt=move model.num_points=20 training.refine_every_n_epochs=10 data.num_points=100

MNIST

Run with and without moving nodes and with a varying number of points. You can use a larger learning_rate to get faster results:

python examples/mnist_classification.py -m model_type=adaptive epochs=100 move_nodes=True,False num_points=10 learning_rate=1e-4

Shakespeare

Good results can be approached with settings like this:

python examples/shakespeare_generation.py -m training.learning_rate=1e-3 training.num_epochs=20 training.move_every_n_batches=200 model.hidden_size=32,64 model.num_points=32 training.batch_size=128

For a machine with limited memory:

python examples/shakespeare_generation.py -m training.learning_rate=1e-3 training.num_epochs=20 training.move_every_n_batches=50 model.hidden_size=16 model.num_points=32 training.batch_size=64 training.adapt=move

2D Implicit Representation

uv run python examples/implicit_image.py model.normalization=noop training.num_epochs=20

3D Implicit Representation

This one is pretty solid:

python examples/implicit_3d.py mesh_resolution=100 learning_rate=1e-5 hidden_layers=[40, 40]

Rendering after a run:

python examples/implicit_3d.py render_only=true model_path=/path/to/model.pt high_res_resolution=256

Using a different output name:

python examples/implicit_3d.py render_high_res=true render_output_file="my_render.png"

Running visualization tests

Use the -v flag to write data to file:

pytest tests/test_visualization.py -v

Interesting Papers

Deep Networks Always Grok and Here's Why

The papers below are not currently used in this project, but would be interesting to investigate in the future:

Oja’s plasticity rule overcomes several challenges of training neural networks under biological constraints

Continual Learning with Hebbian Plasticity in Sparse and Predictive Coding Networks: A Survey and Perspective

