Unofficial JAX implementation of Deep Learning models
Project description
JAX Models
Table of Contents
- [About The Project](#about-the-project)
- [Getting Started](#getting-started)
  - [Prerequisites](#prerequisites)
  - [Installation](#installation)
  - [Usage](#usage)
- [Contributing](#contributing)
- [License](#license)
- [Contact](#contact)
About The Project
The JAX Models repository aims to provide open-sourced JAX/Flax implementations of research papers that were originally released without code, or whose code was written in frameworks other than JAX. The goal of this project is to build a collection of models, layers, activations and other utilities that are most commonly used in research. All papers and all derived or translated code are cited either in the README or in the docstrings. If you find a missing citation, please raise an issue.
All implementations provided here are available on Papers With Code.
Available model implementations for JAX are:

- MetaFormer is Actually What You Need for Vision (Weihao Yu et al., 2021)
- Augmenting Convolutional networks with attention-based aggregation (Hugo Touvron et al., 2021)
- MPViT: Multi-Path Vision Transformer for Dense Prediction (Youngwan Lee et al., 2021)
- MLP-Mixer: An all-MLP Architecture for Vision (Ilya Tolstikhin et al., 2021)
- Patches Are All You Need (Anonymous et al., 2021)
- SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers (Enze Xie et al., 2021)
Available layers for out-of-the-box integration (a usage sketch follows this list):

- DropPath (Stochastic Depth) (Gao Huang et al., 2021)
- Squeeze-and-Excitation Layer (Jie Hu et al., 2019)
- Depthwise Convolution (François Chollet, 2017)
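To illustrate what one of these layers does, the sketch below expresses a depthwise convolution directly in Flax using `flax.linen.Conv` with `feature_group_count`. It demonstrates the operation itself; it is not the package's own layer API, whose exact import path and signature are not documented here.

```python
# Minimal sketch of a depthwise convolution in plain Flax, for illustration only.
# jax-models ships its own layer; this shows the underlying idea: each input
# channel is convolved with its own filter (feature_group_count == channels).
import jax
import jax.numpy as jnp
import flax.linen as nn

class DepthwiseConv(nn.Module):
    channels: int
    kernel_size: int = 3

    @nn.compact
    def __call__(self, x):
        return nn.Conv(
            features=self.channels,
            kernel_size=(self.kernel_size, self.kernel_size),
            feature_group_count=self.channels,  # one filter per input channel
            padding='SAME',
        )(x)

layer = DepthwiseConv(channels=64)
x = jnp.ones((1, 32, 32, 64))  # NHWC dummy input
params = layer.init(jax.random.PRNGKey(0), x)
y = layer.apply(params, x)
print(y.shape)  # (1, 32, 32, 64)
```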
Prerequisites
Prerequisites can be installed separately through the requirements.txt
file in the main directory using:
pip install -r requirements.txt
The use of a virtual environment is highly recommended to avoid version incompatibilities.
Installation
This project is built with Python 3 for the latest JAX/Flax versions and can be installed directly via pip:
pip install jax-models
If you wish to use the latest version, you can also clone the repository directly:
git clone https://github.com/DarshanDeshpande/jax-models.git
Usage
To list all available model architectures:
from jax_models.models.model_registry import list_models
from pprint import pprint
pprint(list_models())
To load your desired model:
from jax_models.models.model_registry import load_model
model = load_model('mpvit-base', attach_head=True, num_classes=1000)
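A minimal end-to-end sketch follows, assuming the object returned by `load_model` is a Flax linen module that can be initialized and applied to an image batch; depending on the model, extra arguments (for example dropout RNGs or a deterministic/training flag) may be required.

```python
# Hedged sketch: assumes load_model returns a Flax linen Module. Some models
# may additionally need dropout RNGs or a deterministic/training flag.
import jax
import jax.numpy as jnp
from jax_models.models.model_registry import load_model

model = load_model('mpvit-base', attach_head=True, num_classes=1000)

rng = jax.random.PRNGKey(0)
dummy_batch = jnp.ones((1, 224, 224, 3))  # NHWC dummy input

params = model.init(rng, dummy_batch)      # initialize parameters
logits = model.apply(params, dummy_batch)  # forward pass
print(logits.shape)                        # (1, 1000) with the head attached
```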
Contributing
Please raise an issue if any implementation gives incorrect results, crashes unexpectedly during training or inference, or if any citation is missing.
You can contribute to jax_models by supporting this work with compute resources or by contributing your own pretrained weights.
If you wish to donate to this initiative, please drop me a mail.
License
Distributed under the Apache 2.0 License. See LICENSE
for more information.
Contact
Feel free to reach out for any issues or requests related to these implementations.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: jax_models-0.0.2.tar.gz
Built Distribution: jax_models-0.0.2-py3-none-any.whl
File details
Details for the file jax_models-0.0.2.tar.gz.
File metadata
- Download URL: jax_models-0.0.2.tar.gz
- Upload date:
- Size: 19.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 843a8012a913d87cfa1b1b7ae0d07b8b89aa627400f1502a54d5aa6bb3d6ab85 |
| MD5 | 4985ef19676e644613bffc1ab54159dd |
| BLAKE2b-256 | f98656797900672fb49e83ace58ec9c7587b872ed5202913e7d54263347678c6 |
File details
Details for the file jax_models-0.0.2-py3-none-any.whl.
File metadata
- Download URL: jax_models-0.0.2-py3-none-any.whl
- Upload date:
- Size: 22.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | c6c9dc44daeb0f3e41d0afdb942167f9610134855540d9466c8fc73983b3c332 |
| MD5 | 6a1d18305cd858820c682d6f96ec1c22 |
| BLAKE2b-256 | 47c7f8583abb547e08559ef1f22058d64a975d112eaa56d5cfbf9b6327828749 |