
JAX Image Models

Project description


Future home of JAX Image Models (jimm), a sibling library of https://github.com/rwightman/pytorch-image-models. Like timm, jimm will become a collection of JAX-based image models with pretrained weights, focused on transfer learning. The first models included here will be the Flax Linen adaptations of the MBConv family (EfficientNet, MobileNetV2/V3, etc.) from https://github.com/rwightman/efficientnet-jax.

jimm will be built while exploring transfer learning and the impact of different augmentation, regularization, and supervised, semi-supervised, and self-supervised pretraining techniques on the transferability of weights to different target tasks.

Specifically, I hope to compare transfer learning across a wide variety of models, target datasets, and tasks, with pretraining regimes including:

  • supervised ImageNet-1k training with heavy augmentation and regularization
  • supervised ImageNet-1k training with light augmentation and regularization
  • supervised training on 'larger' datasets (ImageNet-21k, OpenImages) w/ light augmentation and regularization (may try heavy, since these aren't JFT-300M scale)
  • semi-supervised and self-supervised pretraining (SimCLRv2, BYOL, FixMatch, etc.) on ImageNet-1k/21k and OpenImages

There is already a body of research on the subject of transfer learning. Much of my work here will not be breaking new ground, but it will provide me with an opportunity to learn and do what I do best -- refine and improve. Some papers in this space:

Papers on self- and semi-supervised techniques that I plan to explore:

The scope of 'transfer learning' will initially cover fine-tuning (the head replaced with a newly initialized task-specific head, 0..N layers frozen). I may later explore linear-classifier or low-shot evaluation of semi-/self-supervised pretraining.
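The fine-tuning setup above can be sketched with plain Python over a nested param tree. The param-group names here (`stem`, `stage0`, ..., `head`) are hypothetical, not jimm's actual layout; the resulting label tree is the sort of thing one could feed to `optax.multi_transform` to apply zero updates to frozen groups.

```python
# Sketch of labeling param groups for fine-tuning: the newly initialized head
# and later stages train, while the first N backbone stages stay frozen.
# Param names are hypothetical; a real Flax param tree has the same dict shape.

def label_params(params, num_frozen_stages):
    """Map each top-level param group to 'frozen' or 'trainable'."""
    labels = {}
    for name in params:
        if name.startswith("stage") and int(name[len("stage"):]) < num_frozen_stages:
            labels[name] = "frozen"
        else:
            labels[name] = "trainable"   # stem, later stages, and the new head
    return labels

# hypothetical backbone with a freshly initialized classification head
params = {"stem": {}, "stage0": {}, "stage1": {}, "stage2": {}, "head": {}}
labels = label_params(params, num_frozen_stages=2)
print(labels)
# {'stem': 'trainable', 'stage0': 'frozen', 'stage1': 'frozen',
#  'stage2': 'trainable', 'head': 'trainable'}
```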

I'm currently planning which datasets to select for transfer learning benchmarks. I'm hoping to explore a cross-section of natural-image datasets (that don't overlap with ImageNet) as well as others (medical, spectrogram, industrial inspection, etc.). I'd like to cover a different cross-section of datasets than usual; I hope to find some interesting options with compatible licenses from various Kaggle challenges and the like. Dataset suggestions welcome.

The development of jimm does not mean I'm abandoning timm. I will build the models in a manner that allows easy movement of weights back and forth. I'm interested in building in JAX because a) I've enjoyed my JAX exploration so far, and b) I have some TPU credits that allow more compute-intensive exploration than my open-source training budget would otherwise allow. I will augment that with my local NVIDIA GPU resources.
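The main mechanical step in moving conv weights between the two libraries is a layout change: PyTorch stores conv kernels as (out, in, H, W) while Flax Linen expects (H, W, in, out). A minimal sketch (illustration only; real conversion also has to handle depthwise-conv grouping, batch-norm stats, and linear-layer transposes):

```python
# Sketch of converting a conv kernel between PyTorch (timm) and Flax layouts.
# PyTorch: (out_channels, in_channels, H, W); Flax Linen: (H, W, in, out).
import numpy as np

def torch_conv_to_flax(w):
    return np.transpose(w, (2, 3, 1, 0))

def flax_conv_to_torch(w):
    return np.transpose(w, (3, 2, 0, 1))

w_torch = np.zeros((64, 32, 3, 3))   # hypothetical 3x3 conv kernel
w_flax = torch_conv_to_flax(w_torch)
print(w_flax.shape)  # (3, 3, 32, 64)

# the round trip is lossless
assert flax_conv_to_torch(w_flax).shape == w_torch.shape
```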

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jimm-0.0.1.tar.gz (3.5 kB)

Uploaded Source

Built Distribution

jimm-0.0.1-py3-none-any.whl (7.1 kB)

Uploaded Python 3

File details

Details for the file jimm-0.0.1.tar.gz.

File metadata

  • Download URL: jimm-0.0.1.tar.gz
  • Upload date:
  • Size: 3.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.0.post20201006 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.8.5

File hashes

Hashes for jimm-0.0.1.tar.gz
Algorithm Hash digest
SHA256 0091ca0d9dc39dbc2d9b4936d3428821ecb9c35ee56184d54c400527e2e53b04
MD5 3edb254579ab6171c623d9d1ba53837b
BLAKE2b-256 e14fc1428fb74b704c85ba8c992e56309255e10b2eeae1ac92f809ad1e69fd09

See more details on using hashes here.
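In practice, checking a downloaded file against the published SHA256 above is a few lines of stdlib Python (the filename here matches the sdist listed on this page; the helper function name is my own):

```python
# Verify a downloaded file against its published SHA256 digest.
import hashlib

EXPECTED = "0091ca0d9dc39dbc2d9b4936d3428821ecb9c35ee56184d54c400527e2e53b04"

def sha256_of(path, chunk_size=1 << 16):
    """Stream the file in chunks so large downloads don't load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

# after downloading the sdist:
# assert sha256_of("jimm-0.0.1.tar.gz") == EXPECTED
```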

File details

Details for the file jimm-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: jimm-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 7.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.0.post20201006 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.8.5

File hashes

Hashes for jimm-0.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 cbeed93f4bfae5abd0ca3f9c0019c761ee958333da467deefccf0191170c5a11
MD5 28884d0022a33a5d3ec82336803eca08
BLAKE2b-256 4019be8ea4a919e1b7d1f02a88662e423c2c3319cad067e17636d5c5c4bb6616

See more details on using hashes here.
