
PyTorch Learning to Rank (LTR)


This is a library for Learning to Rank (LTR) with PyTorch. Its goal is to provide the infrastructure necessary for performing LTR experiments in PyTorch.

Installation

In your virtualenv, simply run:

pip install pytorchltr 

Note that this library requires Python 3.5 or higher.

Documentation

Documentation is available here.

Example

See examples/01-basic-usage.py for a more complete example, including evaluation.

import torch
from pytorchltr.datasets import Example3
from pytorchltr.loss import PairwiseHingeLoss

# Load dataset
train = Example3(split="train")
collate_fn = train.collate_fn()

# Setup model, optimizer and loss
model = torch.nn.Linear(train[0].features.shape[1], 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss = PairwiseHingeLoss()

# Train for 3 epochs
for epoch in range(3):
    loader = torch.utils.data.DataLoader(train, batch_size=2, collate_fn=collate_fn)
    for batch in loader:
        xs, ys, n = batch.features, batch.relevance, batch.n
        batch_loss = loss(model(xs), ys, n).mean()
        optimizer.zero_grad()
        batch_loss.backward()
        optimizer.step()
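
The complete example referenced above also evaluates the trained model. As a rough sketch of what that evaluation can look like, continuing from the training snippet above and assuming the ndcg metric in pytorchltr.evaluation and a "test" split of Example3 (check the documentation for the exact names and signatures):

from pytorchltr.evaluation import ndcg

# Assumes ndcg(scores, relevance, n, k) returns one NDCG@k value
# per query in the batch; see the documentation for the exact API.
test = Example3(split="test")
loader = torch.utils.data.DataLoader(
    test, batch_size=2, collate_fn=test.collate_fn())

results = []
with torch.no_grad():
    for batch in loader:
        xs, ys, n = batch.features, batch.relevance, batch.n
        results.append(ndcg(model(xs), ys, n, k=10))

print("mean NDCG@10: %.4f" % torch.cat(results).mean().item())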

Dataset Disclaimer

This library provides utilities to automatically download and prepare several public LTR datasets. We cannot vouch for the quality, correctness or usefulness of these datasets. We do not host or distribute these datasets, and it is ultimately your responsibility to determine whether you have permission to use each dataset under its respective license.
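
For example, one of the bundled dataset classes can download and cache a benchmark collection on first use. A minimal sketch, assuming the MSLR10K class and constructor arguments listed in the documentation (these may differ between versions):

from pytorchltr.datasets import MSLR10K

# First use downloads and prepares fold 1 of MSLR-WEB10K;
# subsequent calls load it from the local cache.
train = MSLR10K(split="train", fold=1)
print("%d training queries" % len(train))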

Citing

If you find this software useful for your research, we kindly ask you to cite the following publication:

@inproceedings{jagerman2020accelerated,
    author = {Jagerman, Rolf and de Rijke, Maarten},
    title = {Accelerated Convergence for Counterfactual Learning to Rank},
    year = {2020},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    booktitle = {Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval},
    doi = {10.1145/3397271.3401069},
    series = {SIGIR '20}
}

Download files

Download the file for your platform.

Source Distribution

pytorchltr-0.2.1.tar.gz (144.0 kB)

Built Distributions

pytorchltr-0.2.1-cp38-cp38-win_amd64.whl (106.5 kB) - CPython 3.8, Windows x86-64
pytorchltr-0.2.1-cp38-cp38-manylinux1_x86_64.whl (345.4 kB) - CPython 3.8, manylinux1 x86-64
pytorchltr-0.2.1-cp38-cp38-macosx_10_14_x86_64.whl (106.8 kB) - CPython 3.8, macOS 10.14+ x86-64
pytorchltr-0.2.1-cp37-cp37m-win_amd64.whl (105.4 kB) - CPython 3.7m, Windows x86-64
pytorchltr-0.2.1-cp37-cp37m-manylinux1_x86_64.whl (335.2 kB) - CPython 3.7m, manylinux1 x86-64
pytorchltr-0.2.1-cp37-cp37m-macosx_10_14_x86_64.whl (107.4 kB) - CPython 3.7m, macOS 10.14+ x86-64
pytorchltr-0.2.1-cp36-cp36m-win_amd64.whl (105.4 kB) - CPython 3.6m, Windows x86-64
pytorchltr-0.2.1-cp36-cp36m-manylinux1_x86_64.whl (336.3 kB) - CPython 3.6m, manylinux1 x86-64
pytorchltr-0.2.1-cp36-cp36m-macosx_10_14_x86_64.whl (107.2 kB) - CPython 3.6m, macOS 10.14+ x86-64
pytorchltr-0.2.1-cp35-cp35m-win_amd64.whl (104.8 kB) - CPython 3.5m, Windows x86-64
pytorchltr-0.2.1-cp35-cp35m-manylinux1_x86_64.whl (332.5 kB) - CPython 3.5m, manylinux1 x86-64
pytorchltr-0.2.1-cp35-cp35m-macosx_10_15_x86_64.whl (106.3 kB) - CPython 3.5m, macOS 10.15+ x86-64
