
PyTerrier components for T5 ranking


PyTerrier_t5

This is the PyTerrier plugin for the Mono and Duo T5 ranking approaches [Nogueira21].

Note that this package only supports scoring from a pretrained model (like this one).

Installation

This repository can be installed using pip:

pip install --upgrade git+https://github.com/terrierteam/pyterrier_t5.git

Building T5 pipelines

You can use MonoT5 just like any other text-based re-ranker. By default, it uses a MonoT5 model previously trained on MS MARCO passage ranking training queries.

import pyterrier as pt
from pyterrier_t5 import MonoT5ReRanker, DuoT5ReRanker
monoT5 = MonoT5ReRanker() # loads castorini/monot5-base-msmarco by default
duoT5 = DuoT5ReRanker() # loads castorini/duot5-base-msmarco by default

dataset = pt.get_dataset("irds:vaswani") # provides the document text for get_text
bm25 = pt.BatchRetrieve(pt.get_dataset("vaswani").get_index(), wmodel="BM25") # provides the index
mono_pipeline = bm25 >> pt.text.get_text(dataset, "text") >> monoT5
duo_pipeline = mono_pipeline % 5 >> duoT5 # apply a rank cutoff of 5 from monoT5 since duoT5 is too costly to run over the full result list

Note that both approaches require the document text to be included in the dataframe (see pt.text.get_text).

MonoT5ReRanker and DuoT5ReRanker have the following options:

  • model (default: 'castorini/monot5-base-msmarco' for mono, 'castorini/duot5-base-msmarco' for duo). HGF model name. Defaults to a version trained on MS MARCO passage ranking.
  • tok_model (default: 't5-base'). HGF tokenizer name.
  • batch_size (default: 4). How many documents to process at the same time.
  • text_field (default: 'text'). The dataframe attribute in which the document text is stored.
  • verbose (default: True). Show progress bar.

Examples

Check out the notebooks, which can also be run on Colab.

Implementation Details

We use a PyTerrier transformer to score documents using a T5 model.

Sequences longer than the model's maximum of 512 tokens are silently truncated. Consider splitting long texts into passages and aggregating the results (examples).

References

Credits

  • Sean MacAvaney, University of Glasgow
