Hash-based Deep Learning
Overview
This repository is a non-official, third-party re-implementation of SLIDE [1].
We provide
- Python package
- Hash-based Deep Learning
- Parallel computing based on C++17 parallel STL
We don't provide
- Explicitly CPU-optimized code such as AVX (we just rely on compiler optimization)
- Pre-compiled binaries (you need to compile it yourself)
Install
There are two options, "Install from PyPI" and "Install from Source". For ordinary users, "Install from PyPI" is recommended.
In both cases, a sufficiently recent C++ compiler is necessary.
Requirements
- Recent C++ compiler with parallel STL algorithm support
- Python 3
These requirements can be installed on the Docker image gcc:10:
# On local machine
docker run -it gcc:10 bash
# On gcc:10 image
apt update && apt install -y python3-pip libtbb-dev
Install from PyPI
pip install HashDL
Install from Source
git clone https://gitlab.com/ymd_h/hashdl.git HashDL
cd HashDL
pip install .
Features
- Neural Network
- hash-based sparsely-computed fully-connected (dense) layer
- Activation
- ReLU
- linear (no activation)
- sigmoid
- Optimizer
- SGD
- Adam [2]
- Weight Initializer
- constant
- Gaussian distribution
- Hash for similarity (see the sketch below)
- WTA
- DWTA [3]
- Scheduler for hash update
- constant
- exponential decay
In the current architecture, CNNs are not supported.
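To illustrate the winner-take-all idea behind the WTA/DWTA hashes listed above, here is a minimal C++ sketch of a single WTA hash function. The function name, the fixed bin size, and the permutation handling are illustrative assumptions, not HashDL's actual code; DWTA additionally "densifies" bins that come out empty on sparse inputs.

// Minimal sketch of one WTA hash function (illustrative, not the HashDL API).
#include <algorithm>
#include <cstddef>
#include <numeric>
#include <random>
#include <vector>

// Look at the first k coordinates of the permuted input and return the
// position of the largest one; similar vectors tend to share this argmax.
std::size_t wta_hash(const std::vector<double>& x,
                     const std::vector<std::size_t>& perm,
                     std::size_t k) {
    std::size_t arg_max = 0;
    for (std::size_t i = 1; i < k; ++i) {
        if (x[perm[i]] > x[perm[arg_max]]) {
            arg_max = i;
        }
    }
    return arg_max;  // contributes log2(k) bits to the full hash code
}

int main() {
    std::vector<double> x{0.1, 0.0, 0.7, 0.0, 0.3, 0.0};

    // One random permutation per hash function, fixed at construction time.
    std::vector<std::size_t> perm(x.size());
    std::iota(perm.begin(), perm.end(), 0);
    std::shuffle(perm.begin(), perm.end(), std::mt19937{42});

    std::size_t code = wta_hash(x, perm, 4);
    (void)code;
}

Roughly speaking, several such codes are concatenated into a bucket id; a layer then computes only the neurons stored in the buckets the input falls into, which is where the sparsity comes from.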
Implementation
The official reference implementation focuses on performance and accepts some "dirtiness", such as hard-coded magic numbers for algorithm selection and unmanaged memory allocation.
We accept some (but hopefully small) overhead and improve maintainability in terms of software engineering:
- Polymorphism with inheritance and virtual function
- RAII and smart pointer for memory management
This architecture allows us to construct and manage C++ classes from Python without recompiling.
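The sketch below shows this style; the class names and methods are illustrative assumptions, not HashDL's actual hierarchy.

// Run-time algorithm selection via virtual dispatch instead of hard-coded
// magic numbers, with ownership expressed through smart pointers (RAII).
#include <memory>

class Optimizer {
public:
    virtual ~Optimizer() = default;
    virtual double step(double weight, double grad) = 0;
};

class SGD final : public Optimizer {
    double lr;
public:
    explicit SGD(double lr) : lr(lr) {}
    double step(double weight, double grad) override {
        return weight - lr * grad;
    }
};

// No manual delete anywhere: the unique_ptr releases the object, and a
// Python binding can hold the same owning handle.
std::unique_ptr<Optimizer> make_optimizer() {
    return std::make_unique<SGD>(0.01);
}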
We also rely on the recent C++ standard and compiler optimizations:
- Parallel STL from C++17
- Because of RVO (or at least move semantics), returning std::vector by value is no longer as costly as it used to be.
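Both points can be seen in the small sketch below (illustrative, not HashDL's code); with GCC, the parallel execution policy is backed by TBB, which is why libtbb-dev is installed above.

// ReLU over a whole vector with the C++17 parallel STL; the result is
// returned by value, which RVO (or a cheap move) makes inexpensive.
#include <algorithm>
#include <execution>
#include <vector>

std::vector<double> relu_all(const std::vector<double>& x) {
    std::vector<double> y(x.size());
    std::transform(std::execution::par, x.begin(), x.end(), y.begin(),
                   [](double v) { return v > 0.0 ? v : 0.0; });
    return y;  // no deep copy here: RVO or move semantics apply
}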
Footnotes
[1] B. Chen et al., "SLIDE: In Defense of Smart Algorithms over Hardware Acceleration for Large-Scale Deep Learning Systems", MLSys 2020.
[2] D. P. Kingma and J. Ba, "Adam: A Method for Stochastic Optimization", ICLR 2015.
[3] B. Chen et al., "Densified Winner Take All (WTA) Hashing for Sparse Datasets", UAI 2018.