
A package for learning basis functions over arbitrary function sets. Even high-dimensional problems can be solved with a small number of basis functions, enabling zero-shot transfer within these spaces and yielding a fully informative representation of each function via its basis coefficients. Hilbert spaces are nifty.

Project description

Function Encoder

A function encoder learns basis functions/vectors over arbitrary Hilbert spaces. This allows for zero-shot transfer within the learned space by using a weighted combination of the basis functions to approximate any function in the space. The coefficients can be computed quickly from a small amount of data, either via an inner product or via least squares. The basis functions are learned from data as a neural network, which allows them to scale to high-dimensional function spaces. Furthermore, since the number of basis functions is fixed, this yields a fixed-size representation of the function which can be used for downstream tasks.
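
As a minimal sketch of the coefficient computation (not the library's API), the snippet below solves the least-squares problem against a fixed polynomial basis; the fixed basis is a stand-in for the learned neural-network basis functions.

```python
import numpy as np

# Hypothetical sketch: given k basis functions evaluated at m sample points,
# solve for the coefficients that best reproduce a target function's samples.
m, k = 50, 3                       # m data points, k basis functions
x = np.linspace(-1, 1, m)

# Stand-in basis: fixed polynomials (for illustration only; the real basis
# functions are learned neural networks).
G = np.stack([np.ones_like(x), x, x**2], axis=1)   # shape (m, k)

f = 2.0 - 0.5 * x + 3.0 * x**2     # samples of the target function

# Least-squares coefficients: c = argmin_c ||G c - f||^2
c, *_ = np.linalg.lstsq(G, f, rcond=None)

f_hat = G @ c                      # fixed-size representation -> reconstruction
print(np.allclose(f_hat, f))       # True: the target lies in the span
```

The coefficient vector `c` is the fixed-size representation: any downstream task can consume it in place of the raw function data.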

See the original paper for a mathematical introduction or the blog for an intuitive explanation of function encoders.

Examples

Each of the following examples can be found in the Examples/ directory. These examples illustrate the basic use of this repo and algorithm, but by no means represent the full extent of its applications or scalability.

Euclidean Vectors

This algorithm can be applied to any Hilbert space. To visualize it, we can apply it to Euclidean vectors. Watch as the basis vectors (black) converge to span the Hilbert space being fit (blue square).

https://github.com/tyler-ingebrand/FunctionEncoder/assets/105821676/174ddf15-de2d-44dc-b7fe-6b5fad831a4b
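
The training loop behind this animation can be sketched as follows. This is an illustrative stand-in, not the library's implementation: two learnable basis vectors in R^3 are trained by gradient descent so that least-squares combinations of them reconstruct vectors drawn from a 2-D subspace (playing the role of the blue square).

```python
import numpy as np

rng = np.random.default_rng(0)
# True subspace the data comes from (an assumption for this sketch).
span = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, -1.0]])
B = rng.normal(size=(3, 2))                      # basis vectors as columns, random init

for _ in range(2000):
    w = rng.uniform(-1, 1, size=(32, 2))
    v = w @ span                                 # batch of vectors from the subspace
    c = np.linalg.lstsq(B, v.T, rcond=None)[0]   # best coefficients per vector
    err = B @ c - v.T                            # reconstruction residual
    B -= 0.05 * (err @ c.T) / len(v)             # gradient step on the basis

v = np.array([0.5, -0.2]) @ span                 # a fresh vector in the subspace
c = np.linalg.lstsq(B, v, rcond=None)[0]
print(np.allclose(B @ c, v, atol=1e-2))          # True once the basis spans the subspace
```

Because the coefficients are re-solved at every step, the loss is the projection residual onto the current span of the basis, which shrinks as the basis rotates into the target subspace.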

Quadratics

A figure showing approximations over quadratics.

In the figure above, each panel shows a quadratic function (blue) and its function encoder approximation (orange). Small amounts of data are taken from each quadratic function, and used to compute a representation by taking the Monte Carlo approximation of the inner product between the function and basis functions. Then, the function is approximated as a weighted combination of basis functions. As you can see, a single set of learned basis functions is able to reproduce all nine of these quadratics accurately.
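
The Monte Carlo inner-product step can be sketched as below. A fixed orthonormal polynomial basis stands in for the learned networks, and the sample sizes are illustrative; this is not the library's API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal (w.r.t. Lebesgue measure on [-1, 1]) Legendre-style basis,
# a stand-in for the learned basis functions.
def g(x):
    return np.stack([
        np.full_like(x, np.sqrt(1 / 2)),
        np.sqrt(3 / 2) * x,
        np.sqrt(5 / 8) * (3 * x**2 - 1),
    ], axis=1)                                   # shape (N, 3)

f = lambda x: 1.0 + 2.0 * x - x**2               # one quadratic from the family

x = rng.uniform(-1, 1, size=10_000)              # data sampled from the domain
# Monte Carlo inner products <f, g_j>; the factor 2 is the measure of [-1, 1].
c = 2 * np.mean(f(x)[:, None] * g(x), axis=0)    # shape (3,)

x_test = np.linspace(-1, 1, 5)
f_hat = g(x_test) @ c                            # weighted combination of basis
print(np.max(np.abs(f_hat - f(x_test))))         # small Monte Carlo error
```

The same three coefficients would differ for each of the nine quadratics, but the basis itself stays fixed, which is what makes the representation reusable across the whole space.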

The basis functions look like this:

A figure showing the basis functions

Distributions

As distributions are also Hilbert spaces, we can apply the exact same algorithm. The only difference is the definition of the inner product. The black dots below are example data points, and the red area indicates the approximated probability density function. Just like in the quadratic example, the same basis functions are able to approximate the pdfs of all of these distributions.

A figure showing Gaussian donuts
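
A sketch of why sample data suffices here: for a density p, the L2 inner product satisfies <p, g_j> = E_{x~p}[g_j(x)], so the coefficients can be estimated from samples of the distribution alone, with no density labels. The orthonormal cosine basis and the Beta distribution below are illustrative assumptions, standing in for the learned basis and the plotted distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):                                     # orthonormal basis on [0, 1]
    return np.stack([np.ones_like(x)] +
                    [np.sqrt(2) * np.cos(np.pi * k * x) for k in (1, 2, 3, 4)],
                    axis=1)

samples = rng.beta(2, 5, size=50_000)         # data points drawn from the pdf
c = g(samples).mean(axis=0)                   # <p, g_j> = E_p[g_j(x)], by Monte Carlo

x = np.linspace(0, 1, 201)
p_hat = g(x) @ c                              # approximated density
integral = np.sum((p_hat[:-1] + p_hat[1:]) / 2) * (x[1] - x[0])
print(integral)                               # ~1.0: normalization recovered from samples
```

Note that a truncated expansion like `p_hat` can dip slightly negative in low-density regions; in practice more (and learned) basis functions sharpen the fit.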

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

functionencoder-0.0.1.tar.gz (15.6 kB)

Uploaded Source

Built Distribution

FunctionEncoder-0.0.1-py3-none-any.whl (23.2 kB)

Uploaded Python 3

File details

Details for the file functionencoder-0.0.1.tar.gz.

File metadata

  • Download URL: functionencoder-0.0.1.tar.gz
  • Upload date:
  • Size: 15.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for functionencoder-0.0.1.tar.gz

  • SHA256: 460eb3a4f90b57fcc740eee2b42be181be8d5eb7875bf84545f8897228195d25
  • MD5: 79b96f19a6cc7b62e142589cc46c77e3
  • BLAKE2b-256: 54f01d9619ad79bce2c2b5d42dd7abc1067cf87c71be5ebee16d5d81367b7119


File details

Details for the file FunctionEncoder-0.0.1-py3-none-any.whl.

File metadata

File hashes

Hashes for FunctionEncoder-0.0.1-py3-none-any.whl

  • SHA256: 1f9c511e801e21533310081fbb50e0cb2b1a39943422063aa1617479ab0380c6
  • MD5: c67150abce4a1ce73f80ccf2f8e75086
  • BLAKE2b-256: 52c3b5428f74f9db1bdfea692d02cb4226659f625baf1079a5ec4e0b1461a494

