# ireval
This Python package provides implementations of the most common information retrieval (IR) metrics. Our goal is to return the same scores as trec_eval. We validate this by extensively comparing our implementations with trec_eval's results across many different datasets.
ireval can be installed via

```bash
pip install ireval
```
## Implemented metrics
The following metrics are currently implemented:
| Name | Function | Description |
|---|---|---|
| Precision@k | `precision_at_k` | Precision is the fraction of retrieved documents that are relevant to the query. Precision@k considers only the k highest-scoring documents. |
| Precision@k% | `precision_at_k_percent` | Precision is the fraction of retrieved documents that are relevant to the query. Precision@k% considers only the top k% of documents by score. |
| Recall@k | `recall_at_k` | Recall is the fraction of the relevant documents that are successfully retrieved. Recall@k considers only the k highest-scoring documents. |
| Recall@k% | `recall_at_k_percent` | Recall is the fraction of the relevant documents that are successfully retrieved. Recall@k% considers only the top k% of documents by score. |
| Average precision | `average_precision` | Average precision is the area under the precision-recall curve. |
| R-precision | `r_precision` | R-precision is the precision after R documents have been retrieved, where R is the number of relevant documents for the topic. |
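For intuition, here is a minimal sketch of precision@k following the definition above. This is an illustrative re-derivation (tie-breaking details aside), not the package's actual implementation; `precision_at_k_sketch` is a name made up for this example:

```python
import numpy as np

def precision_at_k_sketch(relevancies, scores, k):
    # Rank documents by descending score and keep the indices of the top k.
    top_k = np.argsort(scores)[::-1][:k]
    # Precision@k: fraction of those top-k documents that are relevant.
    return np.asarray(relevancies)[top_k].sum() / k

# With relevancies = [1, 0, 1, 1, 0] and scores = [0.1, 0.4, 0.35, 0.8, 0.25],
# the three highest scores are 0.8, 0.4, 0.35 with relevancies 1, 0, 1,
# so precision@3 = 2/3.
print(precision_at_k_sketch([1, 0, 1, 1, 0], [0.1, 0.4, 0.35, 0.8, 0.25], 3))  # 0.666...
```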
## Usage
```python
import ireval

# Binary relevance judgments and the retrieval score assigned to each document.
relevancies = [1, 0, 1, 1, 0]
scores = [0.1, 0.4, 0.35, 0.8, 0.25]

# Precision and recall over the 5 and 5% highest-scoring documents.
p5 = ireval.precision_at_k(relevancies, scores, 5)
p5pct = ireval.precision_at_k_percent(relevancies, scores, 5)

r5 = ireval.recall_at_k(relevancies, scores, 5)
r5pct = ireval.recall_at_k_percent(relevancies, scores, 5)

ap = ireval.average_precision(relevancies, scores)
rprec = ireval.r_precision(relevancies, scores)
```
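To see what a value like `ap` means on this data, here is the average-precision calculation worked out by hand under the standard non-interpolated definition (sum of the precision at each relevant document's rank, divided by the number of relevant documents). This is a sketch, not the package's code, and `average_precision_sketch` is an invented name:

```python
import numpy as np

def average_precision_sketch(relevancies, scores):
    # Rank the relevance judgments by descending score.
    rels = np.asarray(relevancies)[np.argsort(scores)[::-1]]
    # Accumulate precision@rank at the rank of each relevant document.
    hits, precisions = 0, []
    for rank, rel in enumerate(rels, start=1):
        if rel:
            hits += 1
            precisions.append(hits / rank)
    # All relevant documents are scored here, so dividing by len(precisions)
    # equals dividing by the total number of relevant documents.
    return sum(precisions) / len(precisions) if precisions else 0.0

# Ranked by score, the relevancies read [1, 0, 1, 0, 1]: relevant documents
# sit at ranks 1, 3, and 5, giving precisions 1/1, 2/3, and 3/5,
# so AP = (1 + 2/3 + 3/5) / 3 ≈ 0.756.
print(average_precision_sketch([1, 0, 1, 1, 0], [0.1, 0.4, 0.35, 0.8, 0.25]))
```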
## File details

Details for the file `ireval-0.1.1.tar.gz`.

### File metadata

- Download URL: ireval-0.1.1.tar.gz
- Upload date:
- Size: 4.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.13 CPython/3.8.2 Windows/10
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `bebf5faec983468672ca23a9ac20611550f374723d025c2459bf9265eb374b92` |
| MD5 | `2172fa2e60b5802dad0a5f9164068fcf` |
| BLAKE2b-256 | `b8d793e4fcf369685f9d354ac2c00051e8c81f37fb9eb529e087b2e5c131eafc` |
## File details

Details for the file `ireval-0.1.1-py3-none-any.whl`.

### File metadata

- Download URL: ireval-0.1.1-py3-none-any.whl
- Upload date:
- Size: 4.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.13 CPython/3.8.2 Windows/10
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ad0d3680a1bd77b1718f12626d5af4d5ccae453dacf3edfcec20af72dfd7b6b5` |
| MD5 | `38359343c767926b6a50f5e01d2991db` |
| BLAKE2b-256 | `7d4ed6a6061cae1299c7d93583d15dd4ca7409351fb8e04ce6ac2a0e676292ed` |