ireval

This Python package implements the most common information retrieval (IR) metrics. Our goal is to return the same scores as trec_eval; we verify this by extensively comparing our implementations against trec_eval's results on many different datasets. ireval can be installed via

pip install ireval

Implemented metrics

The following metrics are currently implemented:

| Name | Function | Description |
|------|----------|-------------|
| Precision@k | precision_at_k | Precision is the fraction of retrieved documents that are relevant to the query. Precision@k considers only the k documents with the highest scores. |
| Precision@k% | precision_at_k_percent | Precision is the fraction of retrieved documents that are relevant to the query. Precision@k% considers only the top k% of documents by score. |
| Recall@k | recall_at_k | Recall is the fraction of the relevant documents that are successfully retrieved. Recall@k considers only the k documents with the highest scores. |
| Recall@k% | recall_at_k_percent | Recall is the fraction of the relevant documents that are successfully retrieved. Recall@k% considers only the top k% of documents by score. |
| Average precision | average_precision | Average precision is the area under the precision-recall curve. |
| R-precision | r_precision | R-precision is the precision after R documents have been retrieved, where R is the number of relevant documents for the topic. |
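
To make the ranking semantics concrete, here is a minimal pure-Python sketch of Precision@k under the definition above. It is for illustration only and is not ireval's actual implementation; the function name is hypothetical.

def precision_at_k_sketch(relevancies, scores, k):
    # Illustrative only -- not ireval's implementation.
    # Rank documents by score, highest first, then measure what fraction
    # of the top-k documents are relevant.
    ranked = sorted(zip(scores, relevancies), key=lambda pair: pair[0], reverse=True)
    top_k = [relevant for _, relevant in ranked[:k]]
    return sum(top_k) / k

# Top-3 documents by score have relevance labels [1, 0, 1] -> 2/3.
print(precision_at_k_sketch([1, 0, 1, 1, 0], [0.1, 0.4, 0.35, 0.8, 0.25], 3))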

Usage

import ireval

# Binary relevance judgments for five documents and the scores a
# retrieval model assigned to them (higher score = ranked earlier).
relevancies = [1, 0, 1, 1, 0]
scores = [0.1, 0.4, 0.35, 0.8, 0.25]

# Precision over the top 5 documents and over the top 5% of documents.
p5 = ireval.precision_at_k(relevancies, scores, 5)
p5pct = ireval.precision_at_k_percent(relevancies, scores, 5)

# Recall over the top 5 documents and over the top 5% of documents.
r5 = ireval.recall_at_k(relevancies, scores, 5)
r5pct = ireval.recall_at_k_percent(relevancies, scores, 5)

# Summary metrics computed over the full ranking.
ap = ireval.average_precision(relevancies, scores)
rprec = ireval.r_precision(relevancies, scores)
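
Working the example through by hand, as a sanity check under the standard trec_eval-style definitions (exact values depend on the library's tie-breaking and rounding): sorting by descending score orders the relevance labels as [1, 0, 1, 0, 1]. All three relevant documents fall in the top 5, so p5 = 3/5 = 0.6 and r5 = 3/3 = 1.0. With R = 3 relevant documents, rprec considers the top 3 labels [1, 0, 1], giving 2/3 ≈ 0.67, and non-interpolated average precision averages the precision at each relevant rank: (1/1 + 2/3 + 3/5) / 3 ≈ 0.76.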

Download files

Download the file for your platform. If you're not sure which to choose, see the guide to installing packages in the Python Packaging User Guide.

Source Distribution

ireval-0.1.1.tar.gz (4.0 kB)

Built Distribution

ireval-0.1.1-py3-none-any.whl (4.9 kB)

File details

Details for the file ireval-0.1.1.tar.gz.

File metadata

  • Download URL: ireval-0.1.1.tar.gz
  • Upload date:
  • Size: 4.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.13 CPython/3.8.2 Windows/10

File hashes

Hashes for ireval-0.1.1.tar.gz
| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | bebf5faec983468672ca23a9ac20611550f374723d025c2459bf9265eb374b92 |
| MD5 | 2172fa2e60b5802dad0a5f9164068fcf |
| BLAKE2b-256 | b8d793e4fcf369685f9d354ac2c00051e8c81f37fb9eb529e087b2e5c131eafc |

Hashes can be used to verify that the file you download matches the one that was uploaded.
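
As a quick sanity check, you can verify a downloaded archive against the SHA256 digest above. A minimal sketch using Python's standard hashlib module, assuming the file sits in the current directory:

import hashlib

# Expected SHA256 digest for ireval-0.1.1.tar.gz, from the table above.
EXPECTED = "bebf5faec983468672ca23a9ac20611550f374723d025c2459bf9265eb374b92"

# Hash the downloaded archive in binary mode and compare digests.
with open("ireval-0.1.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, "SHA256 mismatch: file may be corrupted or tampered with"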

File details

Details for the file ireval-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: ireval-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 4.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.13 CPython/3.8.2 Windows/10

File hashes

Hashes for ireval-0.1.1-py3-none-any.whl
| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | ad0d3680a1bd77b1718f12626d5af4d5ccae453dacf3edfcec20af72dfd7b6b5 |
| MD5 | 38359343c767926b6a50f5e01d2991db |
| BLAKE2b-256 | 7d4ed6a6061cae1299c7d93583d15dd4ca7409351fb8e04ce6ac2a0e676292ed |

Hashes can be used to verify that the file you download matches the one that was uploaded.
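
These digests can also drive pip's hash-checking mode: listing a hash in a requirements file makes pip verify every download against it. A minimal sketch of a requirements.txt entry pinning the wheel's SHA256 from the table above (you may also want to list the sdist hash for platforms where the wheel is unavailable):

# requirements.txt -- pin ireval to this exact release and digest
ireval==0.1.1 --hash=sha256:ad0d3680a1bd77b1718f12626d5af4d5ccae453dacf3edfcec20af72dfd7b6b5

Installing with pip install -r requirements.txt then fails if the downloaded file's digest does not match.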
