clean-fid: Fixing Inconsistencies in FID
The FID calculation involves many steps that can produce inconsistencies in the final metric. Different implementations use different low-level image processing routines (which are often implemented incorrectly). We provide this library to address these issues and make FID values consistent across different methods.
On Buggy Resizing Libraries and Surprising Subtleties in FID Calculation
Gaurav Parmar, Richard Zhang, Jun-Yan Zhu
CMU and Adobe
Buggy Resizing Operations
The resizing operation is often implemented incorrectly by popular libraries.
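As an illustration (this snippet is not part of the library), a few lines with Pillow and NumPy show the kind of subtlety involved: point sampling (`NEAREST`) aliases a fine checkerboard into a constant image, while an antialiased filter averages it to gray. An incorrectly antialiased resize produces the former behavior:

```python
import numpy as np
from PIL import Image

# 1-pixel checkerboard: a worst case for naive downsampling
board = ((np.indices((128, 128)).sum(axis=0) % 2) * 255).astype(np.uint8)
img = Image.fromarray(board)

# point sampling (no antialiasing) vs. Pillow's antialiased bicubic filter
nearest = np.asarray(img.resize((64, 64), Image.NEAREST))
bicubic = np.asarray(img.resize((64, 64), Image.BICUBIC))

print(nearest.std())   # 0.0 -- every sample lands on the same parity, so the
                       #        checkerboard collapses to a constant image
print(bicubic.mean())  # ~127 -- the filter correctly averages black and white
```

Two images that look identical to the eye can therefore yield very different Inception features, and hence different FID values, depending on which resize implementation was used.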
JPEG Image Compression
Image compression can have a surprisingly large effect on FID.
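A quick sketch (again not library code) makes the point concrete: a JPEG round-trip perturbs pixel values even at a reasonably high quality setting, whereas a lossless format like PNG does not. The perturbation shifts Inception features and hence the FID.

```python
import io
import numpy as np
from PIL import Image

# smooth horizontal gradient; lossless formats preserve it exactly
x = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
img = Image.fromarray(x)

def roundtrip(img, fmt, **kwargs):
    """Save the image to an in-memory buffer and decode it back."""
    buf = io.BytesIO()
    img.save(buf, format=fmt, **kwargs)
    buf.seek(0)
    return np.asarray(Image.open(buf))

png = roundtrip(img, "PNG")
jpg = roundtrip(img, "JPEG", quality=75)

print(np.abs(png.astype(int) - x).max())  # 0  -- PNG is lossless
print(np.abs(jpg.astype(int) - x).max())  # >0 -- JPEG quantization perturbs pixels
```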
Quick Start
- Install the requirements:

```
pip install -r requirements.txt
```
- Install the library (for now, build from source):

```
pip install clean-fid
```
- Compute the FID between two image folders:

```python
import torch
import cleanfid.fid as fid

score = fid.compare_folders(fdir1, fdir2, num_workers=0, batch_size=8,
                            device=torch.device("cuda"),
                            use_legacy_pytorch=False,
                            use_legacy_tensorflow=False)
```
- Compute the FID of a folder of generated images:

```python
import torch
import cleanfid.fid as fid

score = fid.fid_folder(fdir, dataset_name="FFHQ", dataset_res=1024,
                       model=None, use_legacy_pytorch=False,
                       use_legacy_tensorflow=False,
                       num_workers=12, batch_size=128,
                       device=torch.device("cuda"))
```
- Compute the FID inline from a generator:

```python
import torch
import cleanfid.fid as fid

# function that accepts a latent and returns an image in range [0, 255]
gen = lambda z: GAN(latent=z, ... , <other_flags>)

fid_score = fid.fid_model(gen, dataset_name="FFHQ", dataset_res=1024,
                          model=None, z_dim=512, num_fid=50_000,
                          use_legacy_pytorch=False,
                          use_legacy_tensorflow=False,
                          num_workers=0, batch_size=128,
                          device=torch.device("cuda"))
```
Make Custom Dataset Statistics
- dataset_path: folder where the dataset images are stored
- Generate and save the inception statistics:

```python
import numpy as np
import cleanfid.fid as fid

dataset_path = ...
mu, sigma = fid.get_folder_features(dataset_path, num=50_000)
np.savez_compressed("stats.npz", mu=mu, sigma=sigma)
```
- See examples/ffhq_stats.py for a concrete example.
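Once `mu` and `sigma` are saved, the FID against another set of statistics is the Fréchet distance between two Gaussians, ||mu1 - mu2||² + Tr(s1 + s2 - 2·(s1·s2)^(1/2)). As a sketch of that formula (the `frechet_distance` function below is illustrative, not the library's API), it can be computed with NumPy alone:

```python
import numpy as np

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """Fréchet distance between N(mu1, sigma1) and N(mu2, sigma2)."""
    diff = mu1 - mu2
    # Tr((s1 s2)^(1/2)) equals Tr((s1^(1/2) s2 s1^(1/2))^(1/2)); the inner
    # matrix is symmetric PSD, so eigh suffices and no scipy.sqrtm is needed.
    vals, vecs = np.linalg.eigh(sigma1)
    s1_half = (vecs * np.sqrt(np.clip(vals, 0, None))) @ vecs.T
    inner = s1_half @ sigma2 @ s1_half
    covmean_trace = np.sqrt(np.clip(np.linalg.eigvalsh(inner), 0, None)).sum()
    return diff @ diff + np.trace(sigma1) + np.trace(sigma2) - 2 * covmean_trace

# identical statistics give a distance of zero;
# shifting the mean by 1 in each of 4 dimensions adds ||diff||^2 = 4
mu, sigma = np.zeros(4), np.eye(4)
print(frechet_distance(mu, sigma, mu, sigma))        # 0.0
print(frechet_distance(mu, sigma, mu + 1.0, sigma))  # 4.0
```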
Backwards Compatibility
We provide two flags to reproduce the legacy FID score.
- use_legacy_pytorch

  This flag is equivalent to using the popular PyTorch FID implementation provided here. The difference between the scores computed by clean-fid with the use_legacy_pytorch flag and by that implementation is ~1.9e-06. See the documentation for how the two methods are compared.

- use_legacy_tensorflow

  This flag is equivalent to using the official FID implementation released by the authors. Note that in order to use this flag, you need to additionally install TensorFlow.
CleanFID Leaderboard for common tasks
FFHQ @ 1024x1024
| Model | Legacy-FID | Clean-FID |
|---|---|---|
| StyleGAN2 | 2.85 ± 0.05 | 3.08 ± 0.05 |
| StyleGAN | 4.44 ± 0.04 | 4.82 ± 0.04 |
| MSG-GAN | 6.09 ± 0.04 | 6.58 ± 0.06 |
Image-to-Image (horse->zebra @ 256x256), computed using test images
| Model | Legacy-FID | Clean-FID |
|---|---|---|
| MUNIT | | |
| pix2pix (paired) | | |
| CycleGAN | 77.20 | 75.17 |
| CUT | 45.51 | 43.71 |
Image-to-Image (cityscapes @ AxA)
| Model | Legacy-FID | Clean-FID |
|---|---|---|
| MUNIT | | |
| pix2pix (paired) | | |
| CycleGAN | | |
| CUT | | |
Building from source

```
python setup.py bdist_wheel
pip install dist/CleanFID-0.0.1-py3-none-any.whl
```
Credits