
Image similarity, metric learning loss functions for TensorFlow 2+.


tf-metric-learning

TensorFlow 2.2 Python 3.6

Overview

Minimalistic open-source library for metric learning, written with TensorFlow 2, TF-Addons, NumPy, OpenCV (cv2) and Annoy. It contains TensorFlow 2+/tf.keras implementations of some common loss functions and miners, and was inspired by pytorch-metric-learning.

Features

  • All loss functions are implemented as tf.keras.layers.Layer
  • Callbacks for computing recall and for visualizing embeddings in the TensorBoard Projector
  • Simple mining mechanism built on Annoy

Open-source repos

This library contains code that has been adapted and modified from the following great open-source repos; without them this would not have been possible (THANK YOU):

TODO

  • Easy installation/publishing on pypi.org
  • Discriminative layer optimizer (different learning rates) for losses with weights (Proxy, SoftTriple, ...)
  • Some tests 😇
  • Improve the API for combining multiple loss functions
  • Improve and add more miners

Examples

import tensorflow as tf
import numpy as np
from tf_metric_learning.layers import SoftTripleLoss

num_class, num_centers, embedding_size = 10, 2, 256

# The loss is a tf.keras layer that takes the embeddings and their labels as inputs.
inputs = tf.keras.Input(shape=(embedding_size,), name="embeddings")
input_label = tf.keras.Input(shape=(1,), name="labels")
output_tensor = SoftTripleLoss(num_class, num_centers, embedding_size)(inputs, input_label)

model = tf.keras.Model(inputs=[inputs, input_label], outputs=output_tensor)
model.compile(optimizer="adam")  # no loss argument: the loss layer registers the loss itself

# Dummy data, keyed by the input names defined above.
data = {
    "embeddings": np.zeros((1000, embedding_size), dtype=np.float32),
    "labels": np.zeros(1000, dtype=np.float32),
}
model.fit(data, None, epochs=10, batch_size=10)
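
In practice the embeddings fed into the loss layer come from an encoder network, and only that encoder is kept for inference. The sketch below assumes a small convolutional base_network and 32x32x3 images purely for illustration; neither is part of the library.

# Minimal sketch (assumptions: a tiny CNN trunk, 32x32x3 images).
import tensorflow as tf
from tf_metric_learning.layers import SoftTripleLoss

num_class, num_centers, embedding_size = 10, 2, 256

base_network = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(embedding_size),
])

images = tf.keras.Input(shape=(32, 32, 3), name="images")
labels = tf.keras.Input(shape=(1,), name="labels")
embeddings = base_network(images)
output = SoftTripleLoss(num_class, num_centers, embedding_size)(embeddings, labels)

train_model = tf.keras.Model(inputs=[images, labels], outputs=output)
train_model.compile(optimizer="adam")
# After training, base_network alone produces the embeddings used by the
# evaluator and projector callbacks shown below.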

More complex scenarios:

Features

Loss functions

Miners

  • MaximumLossMiner [TODO]
  • TripletAnnoyMiner ✅ (see the sketch below)
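
The exact TripletAnnoyMiner API is not documented here; the sketch below only illustrates the general idea of Annoy-based mining: index the current embeddings with Spotify's Annoy and, for each anchor, look up approximate nearest neighbours with a different label as candidate hard negatives. mine_hard_negatives is a hypothetical helper, not a library function.

import numpy as np
from annoy import AnnoyIndex

def mine_hard_negatives(embeddings, labels, n_trees=10, k=10):
    """Return, for each embedding, the index of its closest neighbour
    with a different label (a candidate hard negative), or None."""
    index = AnnoyIndex(embeddings.shape[1], "euclidean")
    for i, emb in enumerate(embeddings):
        index.add_item(i, emb)
    index.build(n_trees)

    negatives = []
    for i, emb in enumerate(embeddings):
        candidate = None
        for j in index.get_nns_by_vector(emb, k):
            if labels[j] != labels[i]:
                candidate = j
                break
        negatives.append(candidate)
    return negatives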

Evaluators

  • AnnoyEvaluator Callback: evaluates Recall@K; you will need to install Spotify's Annoy library.
import tensorflow as tf
from tf_metric_learning.utils.recall import AnnoyEvaluatorCallback

# base_network, test_images, test_labels, divide and embedding_size
# are assumed to be defined by the surrounding training script.
evaluator = AnnoyEvaluatorCallback(
    base_network,
    {"images": test_images[:divide], "labels": test_labels[:divide]},  # images stored in the index
    {"images": test_images[divide:], "labels": test_labels[divide:]},  # images used to query the index
    normalize_fn=lambda images: images / 255.0,
    normalize_eb=True,
    eb_size=embedding_size,
    freq=1,
)
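
The first dictionary is embedded with base_network and stored in an Annoy index, the second is embedded and used to query it, and Recall@K is then reported (presumably by checking whether the nearest stored neighbours of each query share its label). freq appears to control how often, in epochs, the evaluation runs, and normalize_eb whether the embeddings are L2-normalized before indexing.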

Visualizations

  • TensorBoard Projector Callback
import tensorflow as tf
import numpy as np
from tf_metric_learning.utils.projector import TBProjectorCallback

def normalize_images(images):
    return images / 255.0

(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.cifar10.load_data()
...

# base_model is assumed to be the embedding network being trained.
projector = TBProjectorCallback(
    base_model,
    "tb/projector",
    test_images,              # images to embed and project
    np.squeeze(test_labels),  # their labels
    normalize_eb=True,
    normalize_fn=normalize_images,
)
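
The callback presumably embeds test_images with base_model and writes the projector data to tb/projector, so during or after training the embeddings can be explored in TensorBoard's Projector tab, e.g. by running tensorboard --logdir tb.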
