Torchhd
Torchhd is a Python library for Hyperdimensional Computing (also known as Vector Symbolic Architectures).
- Easy-to-use: Torchhd makes it painless to develop a wide range of Hyperdimensional Computing (HDC) applications and algorithms. For someone new to the field, we provide Pythonic abstractions and examples to get you started fast. For experienced researchers, the library is modular by design, giving you endless flexibility to prototype new ideas in no time.
- Performant: The library is built on top of the high-performance PyTorch library, giving you optimized tensor execution without the headaches. Moreover, PyTorch makes it effortless to accelerate your code on a GPU.
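For instance, because hypervectors in Torchhd are ordinary PyTorch tensors, they can be moved to a GPU like any other tensor. The snippet below is a minimal sketch, assuming a CUDA device may or may not be available:
import torch, torchhd
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torchhd.random(2, 10000).to(device)  # move the hypervectors to the GPU when one is available
y = torchhd.bind(x[0], x[1])  # the binding now runs on that device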
Installation
Torchhd is hosted on PyPI and Anaconda. First, install PyTorch using its installation instructions. Then, use one of the following commands to install Torchhd:
pip install torch-hd
conda install -c torchhd torchhd
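As a quick sanity check of the installation, a small smoke test along these lines should work; it only uses functions that also appear in the example further below:
import torchhd
x = torchhd.random(2, 10000)  # two random 10,000-dimensional hypervectors
print(torchhd.cosine_similarity(x[0], x))  # roughly [1.0, 0.0]: a hypervector is similar to itself and quasi-orthogonal to the other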
Documentation
You can find documentation for Torchhd on the website.
Check out the Getting Started page for a quick overview.
The API documentation is divided into several sections, one for each of the library's modules.
You can improve the documentation by sending pull requests to this repository.
Examples
We have several examples in the repository. Here is a simple one to get you started:
import torch, torchhd
d = 10000 # number of dimensions
# create the hypervectors for each symbol
keys = torchhd.random(3, d)
country, capital, currency = keys
usa, mex = torchhd.random(2, d) # United States and Mexico
wdc, mxc = torchhd.random(2, d) # Washington D.C. and Mexico City
usd, mxn = torchhd.random(2, d) # US Dollar and Mexican Peso
# create country representations
us_values = torch.stack([usa, wdc, usd])
us = torchhd.hash_table(keys, us_values)
mx_values = torch.stack([mex, mxc, mxn])
mx = torchhd.hash_table(keys, mx_values)
# combine all the associated information
mx_us = torchhd.bind(torchhd.inverse(us), mx)
# query for the dollar of mexico
usd_of_mex = torchhd.bind(mx_us, usd)
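# compare the query against all stored hypervectors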
memory = torch.cat([keys, us_values, mx_values], dim=0)
torchhd.cosine_similarity(usd_of_mex, memory)
# tensor([-0.0062, 0.0123, -0.0057, -0.0019, -0.0084, -0.0078, 0.0102, 0.0057, 0.3292])
# The hypervector for the Mexican Peso is the most similar.
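As a small follow-up that is not part of the original example, the most similar stored hypervector can be named explicitly by taking the argmax over the similarity scores (the names list below is simply a labeling of the rows of memory):
names = ["country", "capital", "currency", "usa", "wdc", "usd", "mex", "mxc", "mxn"]
scores = torchhd.cosine_similarity(usd_of_mex, memory)
print(names[scores.argmax().item()])  # expected output: mxn, the Mexican Peso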
This example is from the paper What We Mean When We Say "What's the Dollar of Mexico?": Prototypes and Mapping in Concept Space by Kanerva. It first creates hypervectors for all the symbols used in the computation, i.e., the variables country, capital, and currency, and their values for both countries. These hypervectors are then combined into a single hypervector for each country using a hash table structure, which encodes key-value pairs as k1 * v1 + k2 * v2 + ... + kn * vn, where * denotes binding and + denotes bundling. The two country hash tables are then bound together to form their combined representation, which is finally queried by binding with the US Dollar hypervector to obtain the approximate Mexican Peso hypervector. The similarity output shows that the Mexican Peso hypervector is indeed the most similar one.
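To make the hash table formula concrete, the sketch below rebuilds the us hypervector by hand from the variables in the example above. It assumes that torchhd.multiset performs the bundling (summation) over a batch of hypervectors, and that hash_table is, up to implementation details, a bundle of key-value bindings:
bound = torchhd.bind(keys, us_values)  # element-wise: country*usa, capital*wdc, currency*usd
us_manual = torchhd.multiset(bound)  # bundle the bound pairs: k1*v1 + k2*v2 + k3*v3
print(torchhd.cosine_similarity(us_manual, us.unsqueeze(0)))  # expected to be close to 1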
Supported HDC/VSA models
Currently, the library supports the following HDC/VSA models:
- Multiply-Add-Permute (MAP)
- Binary Spatter Codes (BSC)
- Holographic Reduced Representations (HRR)
- Fourier Holographic Reduced Representations (FHRR)
- Binary Sparse Block Codes (B-SBC)
- Vector-Derived Transformation Binding (VTB)
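As a rough sketch of how a model is selected: basis hypervectors are typically created with a model argument, after which the other operations dispatch on the hypervector type. The vsa keyword and the exact model-name strings below are assumptions based on the list above, so check the API documentation for the precise names:
import torchhd
fhrr = torchhd.random(2, 10000, vsa="FHRR")  # Fourier Holographic Reduced Representation hypervectors (name string assumed)
bsc = torchhd.random(2, 10000, vsa="BSC")  # Binary Spatter Codes (name string assumed)
print(torchhd.bind(fhrr[0], fhrr[1]))  # binding behaves according to the chosen model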
We welcome anyone to help with contributing more models to the library!
About
Initial development of Torchhd was performed by Mike Heddes and Igor Nunes as part of their research in Hyperdimensional Computing at the University of California, Irvine. The library was extended with significant contributions from Pere Vergés and Dheyay Desai. Torchhd later merged with a project by Rishikanth Chandrasekaran, who worked on similar problems as part of his research at the University of California, San Diego.
Contributing
We are always looking for people who want to contribute to the library. If you are considering contributing for the first time, we acknowledge that this can be daunting, but fear not! You can look through the open issues for inspiration on the kind of problems you can work on. If you are a researcher and want to contribute your work to the library, feel free to open a new issue so we can discuss the best strategy for integrating your work.
Documentation
To build the documentation locally do the following:
- Use pip install -r docs/requirements.txt to install the required packages.
- Use sphinx-build -b html docs build to generate the HTML documentation in the /build directory.
To create a clean build, remove the /build and /docs/generated directories.
Creating a New Release
- Increment the version number in version.py using semantic versioning.
- Create a new GitHub release. Set the tag according to PEP 440, e.g., v1.5.2, and provide a clear description of the changes. You can use GitHub's "auto-generate release notes" button. Look at previous releases for examples.
- A GitHub release triggers a GitHub action that builds the library and publishes it to PyPI and Conda, in addition to the documentation website.
Running tests
To run the unit tests located in torchhd/tests
do the following:
- Use pip install -r dev-requirements.txt to install the required development packages.
- Then run the tests using just pytest.
Optionally, to measure the code coverage, use coverage run -m --omit="torchhd/tests/**" pytest to create the coverage report. You can then view this report with coverage report.
License
This library is MIT licensed.
To add the license to all source files, first install licenseheaders and then use licenseheaders -t ./LICENSE -d ./torchhd.
Cite
Consider citing our paper published in the Journal of Machine Learning Research (JMLR) if you use Torchhd in your work:
@article{JMLR:v24:23-0300,
author = {Heddes, Mike and Nunes, Igor and Vergés, Pere and Kleyko, Denis and Abraham, Danny and Givargis, Tony and Nicolau, Alexandru and Veidenbaum, Alex},
title = {Torchhd: An Open Source Python Library to Support Research on Hyperdimensional Computing and Vector Symbolic Architectures},
journal = {Journal of Machine Learning Research},
year = {2023},
volume = {24},
number = {255},
pages = {1--10},
url = {http://jmlr.org/papers/v24/23-0300.html}
}