Learning to Rank with PyTorch (Fork of pytorchltr)

Project description

PyTorch Learning to Rank (LTR)

This is a library for Learning to Rank (LTR) with PyTorch. The goal of this library is to support the infrastructure necessary for performing LTR experiments in PyTorch.

This is a fork of the original pytorchltr. It adds fixes and updates so that the library works with Python >= 3.10.

Installation

In your virtualenv simply run:

pip install pytorchltr2

Note that this library requires Python 3.10 or higher.

Documentation

Original documentation is available here.

Example

See examples/01-basic-usage.py for a more complete example, including evaluation.

import torch
from pytorchltr.datasets import Example3
from pytorchltr.loss import PairwiseHingeLoss

# Load dataset
train = Example3(split="train")
collate_fn = train.collate_fn()

# Setup model, optimizer and loss
model = torch.nn.Linear(train[0].features.shape[1], 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss = PairwiseHingeLoss()

# Train for 3 epochs
loader = torch.utils.data.DataLoader(train, batch_size=2, collate_fn=collate_fn)
for epoch in range(3):
    for batch in loader:
        xs, ys, n = batch.features, batch.relevance, batch.n
        batch_loss = loss(model(xs), ys, n).mean()
        optimizer.zero_grad()
        batch_loss.backward()
        optimizer.step()
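The PairwiseHingeLoss used above is a pairwise objective: for every pair of documents in a query where one is more relevant than the other, the model is penalized unless it scores the more relevant document higher by a margin. The following is a conceptual sketch in plain Python, not the library's actual implementation (which operates on batched tensors and truncates lists via `n`):

```python
def pairwise_hinge(scores, relevance):
    # For each pair (i, j) in one ranked list where item i is more
    # relevant than item j, add max(0, 1 - (s_i - s_j)): zero loss
    # once i is scored at least a margin of 1 above j.
    total = 0.0
    for s_i, r_i in zip(scores, relevance):
        for s_j, r_j in zip(scores, relevance):
            if r_i > r_j:
                total += max(0.0, 1.0 - (s_i - s_j))
    return total

# A correctly ordered list with a large enough margin incurs zero loss:
print(pairwise_hinge([3.0, 1.0], [1, 0]))  # 0.0
# Swapping the scores is penalized:
print(pairwise_hinge([1.0, 3.0], [1, 0]))  # 3.0
```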

Dataset Disclaimer

This library provides utilities to automatically download and prepare several public LTR datasets. We cannot vouch for the quality, correctness or usefulness of these datasets. We do not host or distribute these datasets and it is ultimately your responsibility to determine whether you have permission to use each dataset under its respective license.

Citing

If you find this software useful for your research, please cite the publication for the original pytorchltr.

@inproceedings{jagerman2020accelerated,
    author = {Jagerman, Rolf and de Rijke, Maarten},
    title = {Accelerated Convergence for Counterfactual Learning to Rank},
    year = {2020},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    booktitle = {Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval},
    doi = {10.1145/3397271.3401069},
    series = {SIGIR '20}
}

Project details


Download files

Download the file for your platform.

Source Distribution

pytorchltr2-0.2.3.tar.gz (16.5 kB)

Uploaded Source

File details

Details for the file pytorchltr2-0.2.3.tar.gz.

File metadata

  • Download URL: pytorchltr2-0.2.3.tar.gz
  • Upload date:
  • Size: 16.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for pytorchltr2-0.2.3.tar.gz
Algorithm Hash digest
SHA256 be6506a84438b402ebe5ed5e371768973d452a308d6cfc197ae6c5a581cf1599
MD5 16bc949e0b8ef66458fcf34e2b4a89cb
BLAKE2b-256 0114168c1fa4097a5349394c5a037f6f1412ee89331c2985df99026abaca064f
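You can check a downloaded archive against the SHA256 digest listed above before installing it. A minimal sketch using Python's standard-library hashlib (`sha256_of` is a helper defined here, not part of the package):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    # Hash the file in chunks so large archives don't need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example usage (after downloading the sdist), compare against the
# SHA256 value published above:
# print(sha256_of("pytorchltr2-0.2.3.tar.gz"))
```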


Provenance

The following attestation bundles were made for pytorchltr2-0.2.3.tar.gz:

Publisher: python-publish.yml on akreuzer/pytorchltr

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
