lpips-j – Minimal JAX/Flax port of `lpips` supporting `vgg16`, with pre-trained weights stored in the 🤗 Hugging Face hub.

Project description

LPIPS-J

This is a minimal JAX/Flax port of lpips, the perceptual similarity metric originally implemented in PyTorch in richzhang/PerceptualSimilarity.

Only the essential features have been implemented. Our motivation is to support VQGAN training for DALL•E Mini.

It currently supports the vgg16 backend, leveraging the implementation in flaxmodels.

Pre-trained weights for the network and the linear layers are downloaded from the 🤗 Hugging Face hub.
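
For context, LPIPS compares two images by extracting activations from several VGG16 layers, unit-normalizing them along the channel axis, weighting the squared differences with learned per-channel linear weights, and averaging over space and layers. The snippet below is a conceptual sketch of that computation in plain JAX on pre-extracted feature maps; it is not this library's internal code, and the function and variable names (`normalize_channels`, `lpips_distance`, `lin_weights`) are purely illustrative.

    import jax.numpy as jnp

    def normalize_channels(feat, eps=1e-10):
        # Unit-normalize each spatial position along the channel axis (NHWC layout assumed).
        norm = jnp.sqrt(jnp.sum(feat ** 2, axis=-1, keepdims=True))
        return feat / (norm + eps)

    def lpips_distance(feats_x, feats_y, lin_weights):
        # feats_x / feats_y: lists of VGG16 activations for the two images, one array per layer.
        # lin_weights: learned per-channel weights, one vector per layer.
        total = 0.0
        for fx, fy, w in zip(feats_x, feats_y, lin_weights):
            diff = (normalize_channels(fx) - normalize_channels(fy)) ** 2
            # Weight the squared channel differences, then average over height and width.
            total += jnp.mean(jnp.sum(diff * w, axis=-1), axis=(1, 2))
        return total  # one distance per image in the batch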

Installation

  1. Install JAX for CUDA or TPU following the instructions at https://github.com/google/jax#installation.
  2. Install this package from the repository:
    pip install --upgrade git+https://github.com/pcuenca/lpips-j.git
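
Once installed, usage follows the standard Flax pattern of initializing a module and then applying it. The sketch below assumes the package exposes an `LPIPS` Flax module under `lpips_j.lpips`; the import path, class name, and output shape are assumptions and may differ between versions, so check the repository for the exact API.

    import jax
    import jax.numpy as jnp
    from lpips_j.lpips import LPIPS  # assumed import path

    # Two batches of RGB images in NHWC layout.
    x = jnp.zeros((4, 256, 256, 3))
    y = jnp.ones((4, 256, 256, 3)) * 0.5

    lpips = LPIPS()
    # Standard Flax pattern: initialize parameters once, then apply.
    params = lpips.init(jax.random.PRNGKey(0), x, y)
    distance = lpips.apply(params, x, y)
    print(distance.shape)  # one perceptual distance per image pair (exact shape may vary)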
    

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lpips-j-0.0.6.tar.gz (7.2 kB, source)

Built Distribution

lpips_j-0.0.6-py3-none-any.whl (7.4 kB, Python 3)
