
A package for learning basis functions over arbitrary function sets. This allows even high-dimensional problems to be solved with a small number of basis functions, enables zero-shot transfer within these spaces, and provides a fully informative, fixed-size representation of a function via its basis-function coefficients. Hilbert spaces are nifty.


Function Encoder

A function encoder learns basis functions/vectors over arbitrary Hilbert spaces. This allows for zero-shot transfer within this learned space by using a weighted combination of the basis functions to approximate any function in the space. The coefficients can be computed quickly from a small amount of data, either via an inner product or via least squares. The basis functions are learned from data as a neural network, which allows them to scale to high-dimensional function spaces. Furthermore, since the number of basis functions is fixed, this yields a fixed-size representation of the function which can be used for downstream tasks.
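
To make the representation step concrete, here is a minimal NumPy sketch of the idea (not the package's actual API): three fixed polynomial functions stand in for the learned neural-network basis, and the coefficients for a target function are recovered from a handful of samples via least squares.

import numpy as np

# Toy stand-ins for the learned basis functions (the package learns these as a neural network).
def basis(x):
    # Evaluate k = 3 basis functions at the points x; result has shape (len(x), 3).
    return np.stack([np.ones_like(x), x, x**2], axis=-1)

# A small amount of data sampled from the target function f(x) = 2x^2 - x + 0.5.
x = np.linspace(-1, 1, 20)
y = 2 * x**2 - x + 0.5

# Least squares: solve G c ~= y, where G[j, i] = g_i(x_j).
G = basis(x)
c, *_ = np.linalg.lstsq(G, y, rcond=None)

# c is the fixed-size representation of f; f can then be reconstructed anywhere as a weighted sum.
x_new = np.linspace(-1, 1, 5)
y_hat = basis(x_new) @ c   # approximately 2*x_new**2 - x_new + 0.5

The package follows the same pattern, but with basis functions parameterized by a neural network and trained so that this representation works across an entire space of functions.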

See the original paper for a mathematical introduction or the blog for an intuitive explanation of function encoders.

Installation

For the latest stable release:

pip install FunctionEncoder

For the latest version:

pip install git+https://github.com/tyler-ingebrand/FunctionEncoder.git

Examples

Each of the following examples can be found in the Examples/ directory. These examples illustrate the basic use of this repo and algorithm, but by no means exhaust its applications or scalability.

Euclidean Vectors

This algorithm can be applied to any Hilbert space. To visualize what it looks like, we can apply it to Euclidean vectors. Watch as the basis vectors (black) converge to span the space being fit (blue square).

https://github.com/tyler-ingebrand/FunctionEncoder/assets/105821676/174ddf15-de2d-44dc-b7fe-6b5fad831a4b

Quadratics

A figure showing approximations over quadratics.

In the figure above, each panel shows a quadratic function (blue) and its function encoder approximation (orange). A small amount of data is sampled from each quadratic and used to compute its representation via a Monte Carlo approximation of the inner product between the function and each basis function. The function is then approximated as a weighted combination of the basis functions. As you can see, a single set of learned basis functions reproduces all nine of these quadratics accurately.
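
For intuition on the inner-product route (as opposed to least squares), the coefficients can be estimated with a Monte Carlo average over the sampled data. Below is a rough NumPy sketch, again with toy polynomial stand-ins for the learned basis and ignoring any normalization constant from the sampling domain.

import numpy as np

# Toy stand-ins for the learned basis functions.
def basis(x):
    return np.stack([np.ones_like(x), x, x**2], axis=-1)

# A small sample from one quadratic in the space.
x = np.random.uniform(-1, 1, size=200)
y = 3 * x**2 + 0.2 * x - 1

# Monte Carlo estimate of the inner product <f, g_i> for each basis function
# (up to a constant volume factor). These averages act as reconstruction
# coefficients only when the basis is roughly orthonormal; this toy polynomial
# basis is not, so the numbers here are purely illustrative.
coeffs = (y[:, None] * basis(x)).mean(axis=0)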

The basis functions look like this:

A figure showing the basis functions

Distributions

Since sets of probability distributions also live in a Hilbert space, we can apply the exact same algorithm. The only difference is the definition of the inner product. The black dots below are example data points, and the red area indicates the approximated probability density function. Just as in the quadratic example, a single set of basis functions is able to approximate the PDFs of all of these distributions.

A figure showing Gaussian donuts
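
One way to read "the only difference is the inner product": if the object being represented is a density p, then under a plain L2 inner product <p, g_i> = ∫ p(x) g_i(x) dx = E_{x~p}[g_i(x)], so the Monte Carlo estimate only needs samples drawn from the distribution rather than function evaluations. The inner product actually used by the package may differ; this is just the simplest instance, sketched below with the same toy basis as above.

import numpy as np

# Toy stand-ins for learned basis functions over one-dimensional densities.
def basis(x):
    return np.stack([np.ones_like(x), x, x**2], axis=-1)

# Samples from some unknown distribution (the "black dots" in the figure).
samples = np.random.normal(loc=0.3, scale=0.5, size=1000)

# Under the L2 inner product, <p, g_i> = E_{x~p}[g_i(x)], so each coefficient
# estimate is just the average of that basis function evaluated at the samples.
coeffs = basis(samples).mean(axis=0)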

Related Papers

Citation

If you use this repo for research, please cite

@inproceedings{Ingebrand2024,
  author       = {Tyler Ingebrand and
                  Amy Zhang and
                  Ufuk Topcu},
  title        = {Zero-Shot Reinforcement Learning via Function Encoders},
  booktitle    = {{ICML}},
  year         = {2024},
}
