Project description

Dynamic Non-Uniform Piecewise Linear Layers

Example adding nodes: [animation of function approximation training progress]

Example moving nodes (square wave): [dynamic square wave animation]

Example moving nodes (circle): [dynamic circle animation]

Layer weights for the circle: [dynamic circle weights animation]

A PyTorch implementation of non-uniform piecewise linear layers. These layers can learn arbitrary continuous piecewise linear functions, where both the positions (x-coordinates) and values (y-coordinates) of the control points are learned parameters.
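
For intuition, here is a minimal sketch of such a layer in PyTorch, with both knot positions and knot values as learnable parameters. This is illustrative only, not the package's actual API:

    import torch
    import torch.nn as nn

    class PiecewiseLinearSketch(nn.Module):
        """1D piecewise linear function with learnable knot positions and values."""
        def __init__(self, num_points=8, x_min=-1.0, x_max=1.0):
            super().__init__()
            self.x = nn.Parameter(torch.linspace(x_min, x_max, num_points))  # knot positions
            self.y = nn.Parameter(torch.zeros(num_points))                   # knot values

        def forward(self, inputs):
            # Sort the knots so segments stay well defined after gradient updates.
            x, order = torch.sort(self.x)
            y = self.y[order]
            # Locate the segment each input falls into, then interpolate linearly.
            idx = torch.clamp(torch.bucketize(inputs, x) - 1, 0, x.numel() - 2)
            x0, x1 = x[idx], x[idx + 1]
            y0, y1 = y[idx], y[idx + 1]
            t = (inputs - x0) / (x1 - x0 + 1e-8)
            return y0 + t * (y1 - y0)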

I typically run these either in dynamic mode, i.e., adding nodes during training, or in adaptive mode, where nodes are moved while conserving the total number of nodes.
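
One plausible insertion rule for dynamic mode, continuing the sketch above (this is my own assumption, not necessarily what this package does), is to split the worst-fitting segment at its midpoint, placing the new knot on the current line so the function is unchanged at the moment of insertion:

    @torch.no_grad()
    def add_node(layer, seg_errors):
        # seg_errors: one accumulated error per segment; split the worst one.
        x, order = torch.sort(layer.x)
        y = layer.y[order]
        i = int(seg_errors.argmax())
        new_x = (x[i] + x[i + 1]) / 2
        new_y = (y[i] + y[i + 1]) / 2  # lies on the existing segment: no function change
        layer.x = nn.Parameter(torch.cat([x, new_x.unsqueeze(0)]))
        layer.y = nn.Parameter(torch.cat([y, new_y.unsqueeze(0)]))
        # Recreate the optimizer afterwards so it tracks the new parameters.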

This is a work in progress.

Example moving nodes, implicit representation (40 neurons in a single hidden layer): [animation]

Function Approximation Example

See examples/sine_fitting.py for a complete example of approximating a complex function using the non-uniform piecewise linear layer.
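
If you just want the shape of that example, a minimal loop using the PiecewiseLinearSketch above looks like this (the target function here is an arbitrary stand-in, and the real example does more, including plotting):

    model = PiecewiseLinearSketch(num_points=16, x_min=-3.14, x_max=3.14)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    xs = torch.linspace(-3.14, 3.14, 256)
    target = torch.sin(3 * xs) * torch.exp(-0.2 * xs.abs())  # arbitrary "complex" target

    for step in range(2000):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(model(xs), target)
        loss.backward()
        opt.step()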

Square Wave

A non-default example:

python examples/dynamic_square_wave.py training.adapt=move model.num_points=20 training.refine_every_n_epochs=10 data.num_points=100

MNIST

Running the adaptive model with and without moving nodes and with a varying number of points. You can use a larger learning_rate to get faster results:

python examples/mnist_classification.py -m model_type=adaptive epochs=100 move_nodes=True,False num_points=10 learning_rate=1e-4

Shakespeare

Settings like the following approach good results:

python examples/shakespeare_generation.py -m training.learning_rate=1e-3 training.num_epochs=20 training.move_every_n_batches=200 model.hidden_size=32,64 model.num_points=32 training.batch_size=128

For a machine with limited memory:

python examples/shakespeare_generation.py -m training.learning_rate=1e-3 training.num_epochs=20 training.move_every_n_batches=50 model.hidden_size=16 model.num_points=32 training.batch_size=64 training.adapt=move

2D Implicit Representation

uv run python examples/implicit_image.py model.normalization=noop training.num_epochs=20
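
Conceptually, the 2D task fits a network from normalized pixel coordinates to colors. A sketch of building such a coordinate dataset (the names here are mine, not the example's actual code):

    def make_coordinate_dataset(image):
        # image: (H, W, 3) float tensor with values in [0, 1]
        h, w, _ = image.shape
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij"
        )
        coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)  # (H*W, 2) inputs
        colors = image.reshape(-1, 3)                          # (H*W, 3) targets
        return coords, colors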

3D Implicit Representation

This one is pretty solid

python examples/implicit_3d.py mesh_resolution=100 learning_rate=1e-5 'hidden_layers=[40, 40]'

Rendering after a run:

python examples/implicit_3d.py render_only=true model_path=/path/to/model.pt high_res_resolution=256

Using a different output file name:

python examples/implicit_3d.py render_high_res=true render_output_file="my_render.png"
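
For intuition, mesh extraction at a given mesh_resolution typically means evaluating the trained network on a dense grid and running marching cubes. A sketch under that assumption, where model is a stand-in for a trained (x, y, z) -> scalar network (this is not the example's actual code):

    from skimage import measure

    @torch.no_grad()
    def extract_mesh(model, resolution=100, level=0.0):
        lin = torch.linspace(-1, 1, resolution)
        grid = torch.stack(torch.meshgrid(lin, lin, lin, indexing="ij"), dim=-1)
        values = model(grid.reshape(-1, 3)).reshape(resolution, resolution, resolution)
        verts, faces, normals, _ = measure.marching_cubes(values.numpy(), level=level)
        return verts, faces, normals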

Running visualization tests

Use the -v flag to write data to file:

pytest tests/test_visualization.py -v

Interesting Papers

Deep Networks Always Grok and Here's Why

The papers below are not currently used in this project, but would be interesting to investigate in the future:

Oja’s plasticity rule overcomes several challenges of training neural networks under biological constraints

Continual Learning with Hebbian Plasticity in Sparse and Predictive Coding Networks: A Survey and Perspective

Project details


Download files

Download the file for your platform.

Source Distribution

non_uniform_piecewise_layers-0.1.0.tar.gz (55.5 MB)

Uploaded Source

Built Distribution


non_uniform_piecewise_layers-0.1.0-py3-none-any.whl (59.6 kB)

Uploaded Python 3

File details

Details for the file non_uniform_piecewise_layers-0.1.0.tar.gz.


File hashes

Hashes for non_uniform_piecewise_layers-0.1.0.tar.gz
Algorithm    Hash digest
SHA256       64f5906d082d850517d691333c4c901b3e5c866ead347b9b947a43e1acf9d16a
MD5          4359f4ea5836f39e366089a7cd86b326
BLAKE2b-256  c51d9aced41e2377b803555684368bcf7916d7dc84f15ea77882f650dcbdc920
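
To verify a downloaded file against these digests with Python's standard hashlib, for example:

    import hashlib

    with open("non_uniform_piecewise_layers-0.1.0.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    print(digest == "64f5906d082d850517d691333c4c901b3e5c866ead347b9b947a43e1acf9d16a")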


File details

Details for the file non_uniform_piecewise_layers-0.1.0-py3-none-any.whl.


File hashes

Hashes for non_uniform_piecewise_layers-0.1.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       f584d4f6cdc89fa2ea9661e2f525bef29a3353e0feb02af6a244c579ecafe83e
MD5          38dab7bc5fc38d1ee66128325d65f372
BLAKE2b-256  91b6ae0e6558b54a9d789d476f624181f52addc032c99974e2a20103beb53aa1

