
Torchhd

Torchhd is a Python library for Hyperdimensional Computing (also known as Vector Symbolic Architectures).

  • Easy-to-use: Torchhd makes it painless to develop a wide range of Hyperdimensional Computing (HDC) applications and algorithms. For newcomers to the field, we provide Pythonic abstractions and examples to get you started quickly. For experienced researchers, the library is modular by design, giving you endless flexibility to prototype new ideas in no time.
  • Performant: The library is built on top of the high-performance PyTorch library, giving you optimized tensor execution without the headaches. Moreover, PyTorch makes it effortless to accelerate your code on a GPU, as the sketch below shows.
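
Since hypervectors in Torchhd are ordinary PyTorch tensors, moving a computation to the GPU works like it does for any PyTorch code. A minimal sketch, assuming a CUDA device is available (with a CPU fallback for safety):

import torch, torchhd

device = "cuda" if torch.cuda.is_available() else "cpu"

# hypervectors are regular PyTorch tensors, so .to() moves them like any tensor
keys = torchhd.random(3, 10000).to(device)
values = torchhd.random(3, 10000).to(device)
table = torchhd.hash_table(keys, values)  # runs on the selected device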

Installation

Torchhd is hosted on PyPI and Anaconda. First, install PyTorch using their installation instructions. Then, use one of the following commands to install Torchhd:

pip install torch-hd
conda install -c torchhd torchhd
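
After installing, you can sanity-check the setup by creating a random hypervector. A minimal sketch (illustrative only; the exact tensor type printed may differ between versions):

import torchhd

x = torchhd.random(1, 10000)  # one 10,000-dimensional random hypervector
print(x.shape)                # expected: torch.Size([1, 10000])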

Documentation

You can find documentation for Torchhd on the website.

Check out the Getting Started page for a quick overview.

The API documentation is divided into several sections; see the website for the full overview.

You can improve the documentation by sending pull requests to this repository.

Examples

We have several examples in the repository. Here is a simple one to get you started:

import torch, torchhd

d = 10000  # number of dimensions

# create the hypervectors for each symbol
keys = torchhd.random(3, d)
country, capital, currency = keys

usa, mex = torchhd.random(2, d)  # United States and Mexico
wdc, mxc = torchhd.random(2, d)  # Washington D.C. and Mexico City
usd, mxn = torchhd.random(2, d)  # US Dollar and Mexican Peso

# create country representations
us_values = torch.stack([usa, wdc, usd])
us = torchhd.hash_table(keys, us_values)

mx_values = torch.stack([mex, mxc, mxn])
mx = torchhd.hash_table(keys, mx_values)

# combine all the associated information
mx_us = torchhd.bind(torchhd.inverse(us), mx)

# query for the dollar of mexico
usd_of_mex = torchhd.bind(mx_us, usd)

memory = torch.cat([keys, us_values, mx_values], dim=0)
torchhd.cosine_similarity(usd_of_mex, memory)
# tensor([-0.0062,  0.0123, -0.0057, -0.0019, -0.0084, -0.0078,  0.0102,  0.0057,  0.3292])
# The hypervector for the Mexican Peso is the most similar.

This example is from the paper What We Mean When We Say "What's the Dollar of Mexico?": Prototypes and Mapping in Concept Space by Kanerva. It first creates hypervectors for all the symbols used in the computation, i.e., the variables country, capital, and currency and their values for both countries. These hypervectors are then combined into a single hypervector for each country using a hash table structure. A hash table encodes key-value pairs as: k1 * v1 + k2 * v2 + ... + kn * vn. The hash tables are then bound together to form their combined representation, which is finally queried by binding with the US Dollar hypervector to obtain the approximate Mexican Peso hypervector. The similarity output shows that the Mexican Peso hypervector is indeed the most similar one.
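
To make the hash table encoding concrete, the same structure can be built directly from the binding and bundling primitives. A minimal sketch, assuming torchhd.multibundle bundles a stack of hypervectors into one (this helper is not used in the example above, so treat it as an illustration rather than the canonical construction):

import torch, torchhd

d = 10000
keys = torchhd.random(3, d)
values = torchhd.random(3, d)

# k1 * v1 + k2 * v2 + k3 * v3: bind each key-value pair, then bundle the bound pairs
bound_pairs = torchhd.bind(keys, values)
manual = torchhd.multibundle(bound_pairs)

# the built-in encoding should produce an equivalent hypervector
built_in = torchhd.hash_table(keys, values)
print(torchhd.cosine_similarity(manual, built_in))  # close to 1.0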

Supported HDC/VSA models

Currently, the library supports multiple HDC/VSA models, including Multiply-Add-Permute (MAP), Binary Spatter Codes (BSC), Holographic Reduced Representations (HRR), and Fourier Holographic Reduced Representations (FHRR); see the documentation for the complete list.

We welcome anyone to help with contributing more models to the library!

About

Initial development of Torchhd was performed by Mike Heddes and Igor Nunes as part of their research in Hyperdimensional Computing at the University of California, Irvine. The library was extended with significant contributions from Pere Vergés and Dheyay Desai. Torchhd later merged with a project by Rishikanth Chandrasekaran, who worked on similar problems as part of his research at the University of California, San Diego.

Contributing

We are always looking for people who want to contribute to the library. If you are considering contributing for the first time, we acknowledge that this can be daunting, but fear not! You can look through the open issues for inspiration on the kinds of problems you can work on. If you are a researcher and want to contribute your work to the library, feel free to open a new issue so we can discuss the best strategy for integrating your work.

Documentation

To build the documentation locally do the following:

  1. Use pip install -r docs/requirements.txt to install the required packages.
  2. Use sphinx-build -b html docs build to generate the HTML documentation in the /build directory.

To create a clean build, remove the /build and /docs/generated directories.

Creating a New Release

  1. Increment the version number in version.py using semantic versioning.
  2. Create a new GitHub release. Set the tag according to PEP 440, e.g., v1.5.2, and provide a clear description of the changes. You can use GitHub's "auto-generate release notes" button. Look at previous releases for examples.
  3. A GitHub release triggers a GitHub action that builds the library and publishes it to PyPI and Conda, as well as updating the documentation website.

Running tests

To run the unit tests located in torchhd/tests do the following:

  1. Use pip install -r dev-requirements.txt to install the required development packages.
  2. Then run the tests using just pytest.

Optionally, to measure the code coverage, use coverage run -m --omit="torchhd/tests/**" pytest to create the coverage report. You can then view this report with coverage report.
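
For reference, a test in this suite typically exercises one of the HDC operations end to end. Below is a minimal, hypothetical sketch of such a test; the file name and similarity threshold are illustrative and not taken from the actual test suite:

# test_bind_sketch.py
import torchhd


def test_bind_inverse_recovers_value():
    d = 10000
    key, value = torchhd.random(2, d)

    # bind the pair, then unbind with the inverse of the key
    pair = torchhd.bind(key, value)
    recovered = torchhd.bind(torchhd.inverse(key), pair)

    # the recovered hypervector should be nearly identical to the original value
    assert torchhd.cosine_similarity(recovered, value).item() > 0.9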

License

This library is MIT licensed.

To add the license to all source files, first install licenseheaders and then use licenseheaders -t ./LICENSE -d ./torchhd.

Cite

Consider citing our paper published in the Journal of Machine Learning Research (JMLR) if you use Torchhd in your work:

@article{JMLR:v24:23-0300,
  author  = {Heddes, Mike and Nunes, Igor and Vergés, Pere and Kleyko, Denis and Abraham, Danny and Givargis, Tony and Nicolau, Alexandru and Veidenbaum, Alex},
  title   = {Torchhd: An Open Source Python Library to Support Research on Hyperdimensional Computing and Vector Symbolic Architectures},
  journal = {Journal of Machine Learning Research},
  year    = {2023},
  volume  = {24},
  number  = {255},
  pages   = {1--10},
  url     = {http://jmlr.org/papers/v24/23-0300.html}
}
