
A flexible and extensible metric learning library, written in PyTorch.

Project description

pytorch_metric_learning

Installation:

pip install pytorch_metric_learning

Use a loss function by itself

from pytorch_metric_learning import losses
loss_func = losses.TripletMarginLoss(normalize_embeddings=False, margin=0.1)
loss = loss_func(embeddings, labels)
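To make the snippet concrete, here is a self-contained sketch (plain PyTorch, not the library's internal code) of what a triplet margin loss with normalize_embeddings=False computes: for every (anchor, positive, negative) triplet in the batch, it penalizes d(a, p) - d(a, n) + margin whenever that quantity is positive.

```python
import torch

def triplet_margin_loss(embeddings, labels, margin=0.1):
    # Brute-force illustration: enumerate all valid triplets and
    # average the hinged margin violations.
    dist = torch.cdist(embeddings, embeddings)  # pairwise Euclidean distances
    n = len(labels)
    violations = []
    for a in range(n):
        for p in range(n):
            if p == a or labels[p] != labels[a]:
                continue  # positive must share the anchor's label
            for neg in range(n):
                if labels[neg] == labels[a]:
                    continue  # negative must have a different label
                violations.append(torch.relu(dist[a, p] - dist[a, neg] + margin))
    if not violations:
        return embeddings.sum() * 0.0  # no valid triplets in this batch
    return torch.stack(violations).mean()

embeddings = torch.tensor([[0.0, 0.0], [0.0, 1.0], [3.0, 3.0]])
labels = torch.tensor([0, 0, 1])
print(triplet_margin_loss(embeddings, labels))
# → tensor(0.) (the negative is already farther than the positive plus margin)
```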

Or combine miners and loss functions, regardless of whether they mine or compute loss using pairs or triplets. Pairs are converted to triplets when necessary, and vice versa.

from pytorch_metric_learning import miners, losses
miner = miners.MultiSimilarityMiner(epsilon=0.1)
loss_func = losses.TripletMarginLoss(normalize_embeddings=False, margin=0.1)
hard_pairs = miner(embeddings, labels)
loss = loss_func(embeddings, labels, hard_pairs)
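The pair-to-triplet conversion mentioned above can be sketched in plain Python (an illustration of the idea, not the library's actual implementation): positive and negative pairs that share an anchor are combined into triplets.

```python
def pairs_to_triplets(pos_pairs, neg_pairs):
    # Group negatives by anchor, then combine every positive pair
    # with every negative pair sharing the same anchor.
    negs_by_anchor = {}
    for a, n in neg_pairs:
        negs_by_anchor.setdefault(a, []).append(n)
    triplets = []
    for a, p in pos_pairs:
        for n in negs_by_anchor.get(a, []):
            triplets.append((a, p, n))
    return triplets

pos = [(0, 1), (2, 3)]       # (anchor, positive) index pairs
neg = [(0, 2), (0, 3), (2, 0)]  # (anchor, negative) index pairs
print(pairs_to_triplets(pos, neg))
# → [(0, 1, 2), (0, 1, 3), (2, 3, 0)]
```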

Train using more advanced approaches, like deep adversarial metric learning. For example:

from pytorch_metric_learning import trainers

# Set up your models, optimizers, loss functions etc.
models = {"trunk": your_trunk_model, 
          "embedder": your_embedder_model,
          "G_neg_model": your_negative_generator}

optimizers = {"trunk_optimizer": your_trunk_optimizer, 
              "embedder_optimizer": your_embedder_optimizer,
              "G_neg_model_optimizer": your_negative_generator_optimizer}

loss_funcs = {"metric_loss": losses.AngularNPairs(alpha=35),
              "synth_loss": losses.Angular(alpha=35), 
              "G_neg_adv": losses.Angular(alpha=35)}

mining_funcs = {}

loss_weights = {"metric_loss": 1, 
                "classifier_loss": 0,
                "synth_loss": 0.1,
                "G_neg_adv": 0.1,
                "G_neg_hard": 0.1,
                "G_neg_reg": 0.1}

# Create trainer object
trainer = trainers.DeepAdversarialMetricLearning(
  models=models,
  optimizers=optimizers,
  batch_size=120,
  loss_funcs=loss_funcs,
  mining_funcs=mining_funcs,
  num_epochs=50,
  iterations_per_epoch=100,
  dataset=your_dataset,
  loss_weights=loss_weights
)

# Train!
trainer.train()
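The trunk/embedder split above assumes a backbone that produces features and a small head that projects them to the embedding space. A minimal sketch of what such a pair might look like (the architecture here is hypothetical, chosen only to show the shapes involved):

```python
import torch
from torch import nn

# Hypothetical stand-ins for your_trunk_model and your_embedder_model:
# the trunk maps raw inputs to a feature vector, and the embedder
# projects those features down to the final embedding size.
trunk = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU())
embedder = nn.Linear(256, 64)

batch = torch.randn(8, 3, 32, 32)    # a dummy batch of 32x32 RGB images
embeddings = embedder(trunk(batch))  # embeddings are embedder(trunk(x))
print(embeddings.shape)              # torch.Size([8, 64])
```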

The package also comes with RecordKeeper, which makes it easy to log and save data during training. It automatically looks for special attributes on your objects and logs them to Tensorboard, as well as saving them in CSV and pickle format.

from torch.utils.tensorboard import SummaryWriter
from pytorch_metric_learning.utils import record_keeper as record_keeper_package

pickler_and_csver = record_keeper_package.PicklerAndCSVer(your_folder_for_logs)
tensorboard_writer = SummaryWriter(log_dir=your_tensorboard_folder)
record_keeper = record_keeper_package.RecordKeeper(tensorboard_writer, pickler_and_csver)

# Then during training:
record_keeper.update_records(your_dict_of_objects, current_iteration)

# If you are using one of the provided trainers, just pass in the record
# keeper, and the update step will be taken care of.
trainer = trainers.MetricLossOnly(
  <your other args>,
  record_keeper=record_keeper,
  ...
)

# Now it will update the record_keeper at every iteration
trainer.train()

The nice thing about RecordKeeper is that it makes it very easy to add loggable information when you write a new loss function or miner. Just create a list named "record_these" that contains the names of the attributes you want to record.

import torch

class YourNewLossFunction(BaseMetricLossFunction):
  def __init__(self, **kwargs):
    self.avg_embedding_norm = 0
    self.some_other_useful_stat = 0
    self.record_these = ["avg_embedding_norm", "some_other_useful_stat"]
    super().__init__(**kwargs)

  def compute_loss(self, embeddings, labels, indices_tuple):
    self.avg_embedding_norm = torch.mean(torch.norm(embeddings, p=2, dim=1))
    self.some_other_useful_stat = some_cool_function(embeddings)
    # ... then compute and return the loss as usual
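The attribute lookup that RecordKeeper performs can be illustrated with a toy recorder (a simplification, not the library's code): it scans each object's record_these list and stores the named attributes per iteration.

```python
class TinyRecorder:
    # Illustration of the attribute-lookup idea: for each object, read
    # the attribute names listed in its "record_these" list and store
    # their current values under "<object name>_<attribute>" keys.
    def __init__(self):
        self.history = {}

    def update_records(self, objects, iteration):
        for name, obj in objects.items():
            for attr in getattr(obj, "record_these", []):
                key = "%s_%s" % (name, attr)
                self.history.setdefault(key, []).append((iteration, getattr(obj, attr)))

class DummyLoss:
    def __init__(self):
        self.avg_embedding_norm = 0.0
        self.record_these = ["avg_embedding_norm"]

recorder = TinyRecorder()
loss_obj = DummyLoss()
loss_obj.avg_embedding_norm = 1.5
recorder.update_records({"loss": loss_obj}, iteration=0)
print(recorder.history)
# → {'loss_avg_embedding_norm': [(0, 1.5)]}
```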

Project details


Release history

Download files

Download the file for your platform.

Source Distribution

pytorch_metric_learning-0.9.13.tar.gz (26.7 kB)

Uploaded Source

Built Distribution


pytorch_metric_learning-0.9.13-py3-none-any.whl (42.4 kB)

Uploaded Python 3

File details

Details for the file pytorch_metric_learning-0.9.13.tar.gz.

File metadata

  • Download URL: pytorch_metric_learning-0.9.13.tar.gz
  • Upload date:
  • Size: 26.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for pytorch_metric_learning-0.9.13.tar.gz
  • SHA256: 33e22c202dea95a58a4a35b6c7d035e2cca1a3bfb8342546d8509e8952e0d5e2
  • MD5: 18dd3901fc315f3df8ecedac739bfb53
  • BLAKE2b-256: 2259d9f23e2d7403fe0598fc03d5f195eab01b899871db470c7f57f20b77da4c


File details

Details for the file pytorch_metric_learning-0.9.13-py3-none-any.whl.

File metadata

  • Download URL: pytorch_metric_learning-0.9.13-py3-none-any.whl
  • Upload date:
  • Size: 42.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for pytorch_metric_learning-0.9.13-py3-none-any.whl
  • SHA256: ee9e9fea978715d662ce0c899f19147a7f587cd456eb2a4e1db1d2525d7519f8
  • MD5: 0d3cc696642c6ff176dad6e50b6f55b4
  • BLAKE2b-256: 8bf7da6b41c3185d1e10168f66a8ce2c5929695aec4a44a584c5f0793dcf7c31

