A collection of common recommendation system metrics

Project description

Recommender metrics

This is a collection of commonly used recommendation system (RS) metrics. As fairness in RS is becoming increasingly important, the package also includes functions that ease computing differences in RS performance across user groups, e.g., by gender.

The following metrics are supported (all with the cut-off threshold k):

Notes:
* Averaging the average precision and reciprocal rank over multiple samples yields mean average precision (MAP) and mean reciprocal rank (MRR), respectively, which are commonly reported in research.
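The AP-to-MAP relationship above can be sketched in plain Python (illustrative only, independent of rmet's actual implementation; item IDs and relevance sets are made up):

```python
def average_precision(ranked, relevant, k):
    """AP@k: mean of precision@i over the ranks i at which a relevant item appears."""
    hits, precisions = 0, []
    for i, item in enumerate(ranked[:k], start=1):
        if item in relevant:
            hits += 1
            precisions.append(hits / i)
    return sum(precisions) / min(len(relevant), k) if relevant else 0.0

# Two users: averaging their per-user AP values yields MAP.
ap_u1 = average_precision(["a", "b", "c"], {"a", "c"}, k=3)  # hits at ranks 1 and 3
ap_u2 = average_precision(["x", "y", "z"], {"y"}, k=3)       # hit at rank 2
map_score = (ap_u1 + ap_u2) / 2
```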

Installation

  • Install via pip: python -m pip install rmet

  • Or from source: python -m pip install .

Usage: metrics

To compute a metric, simply call it with your model's output, the true (known) interactions, and a cut-off value k:

from rmet import ndcg
ndcg(model_output, targets, k=10)

Note: Coverage does not require the targets argument.
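For intuition, NDCG@k itself can be sketched in plain Python with binary relevance (illustrative only; rmet's actual implementation may differ, e.g., in input format or tie handling):

```python
import math

def ndcg_at_k(ranked, relevant, k):
    """Binary-relevance NDCG@k: DCG of the ranking divided by the ideal DCG."""
    dcg = sum(1.0 / math.log2(i + 1)
              for i, item in enumerate(ranked[:k], start=1) if item in relevant)
    ideal = sum(1.0 / math.log2(i + 1)
                for i in range(1, min(len(relevant), k) + 1))
    return dcg / ideal if ideal > 0 else 0.0

# Relevant items at ranks 1 and 3; the ideal ranking would put them at ranks 1 and 2.
score = ndcg_at_k(["a", "b", "c", "d"], {"a", "c"}, k=4)
```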

To compute multiple metrics with a single call, check out the calculate function, which accepts a list of metrics to compute:

from rmet import calculate

calculate(
    metrics=["ndcg", "recall"], 
    logits=model_output, 
    targets=targets, 
    k=10,
    return_individual=False,
    flatten_results=True,
)

Sample output:

{
 'ndcg@10': 0.479,
 'recall@10': 0.350
}

If return_individual is set, the metrics are also returned at the sample level, e.g., per user, where possible.

Further, calculate supports efficiently computing metrics for multiple cut-off thresholds at once; simply provide a list of values for k instead of a single number.
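With flattened results, the returned keys follow the "metric@k" pattern seen in the sample output above. A minimal sketch of how metric names and cut-offs combine into such keys (the score values here are made-up placeholders, not rmet output):

```python
metrics = ["ndcg", "recall"]
ks = [5, 10]  # multiple cut-off thresholds in one call
scores = {("ndcg", 5): 0.41, ("ndcg", 10): 0.479,
          ("recall", 5): 0.28, ("recall", 10): 0.35}  # placeholder values

# One flat "metric@k" entry per (metric, cutoff) combination.
flat = {f"{m}@{k}": scores[(m, k)] for m in metrics for k in ks}
```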

Please check out the function's docstring for the full feature description.

Usage: metric differences for user features

One can also instantiate the UserFeature class for a demographic user feature, so that the performance difference of the RS across user groups can be evaluated, e.g., between male and female users in the case of gender.

To do so, you first need to specify which feature value belongs to which user via the UserFeature class, and then call calculate_for_feature similarly to calculate above.

from rmet import UserFeature, calculate_for_feature
ug_gender = UserFeature("gender", ["m", "m", "f", "d", "m"])

calculate_for_feature(
    ug_gender, 
    metrics=["ndcg", "recall"], 
    logits=model_output, 
    targets=targets, 
    k=10,
    return_individual=False,
    flatten_results=True,
)

Sample output:

{
    'gender_f': {'ndcg@10': 0.195, 'recall@10': 0.125},
    'gender_m': {'ndcg@10': 0.779, 'recall@10': 0.733},
    'gender_d': {'ndcg@10': 0.390, 'recall@10': 0.458},
    'gender_f-m': {'ndcg@10': -0.584, 'recall@10': -0.608},
    'gender_f-d': {'ndcg@10': -0.195, 'recall@10': -0.333},
    'gender_m-d': {'ndcg@10': 0.388, 'recall@10': 0.275}
}
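The structure of this output (per-group averages plus pairwise "a-b" differences) can be sketched in plain Python; this is an illustration of the aggregation idea, not rmet's implementation, and the per-user scores below are made-up placeholders:

```python
from itertools import combinations

features = ["m", "m", "f", "d", "m"]        # one feature value per user
user_scores = [0.8, 0.7, 0.2, 0.39, 0.837]  # placeholder per-user scores, e.g. ndcg@10

# Collect per-user scores by feature value.
groups = {}
for value, score in zip(features, user_scores):
    groups.setdefault(value, []).append(score)

# Per-group mean, then every pairwise difference "a-b" = mean(a) - mean(b).
result = {f"gender_{g}": sum(s) / len(s) for g, s in groups.items()}
for a, b in combinations(groups, 2):
    result[f"gender_{a}-{b}"] = result[f"gender_{a}"] - result[f"gender_{b}"]
```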

License

MIT License - see the LICENSE file for more details.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

rmet-0.1.4.tar.gz (12.6 kB view details)

Uploaded Source

Built Distribution

rmet-0.1.4-py3-none-any.whl (10.7 kB view details)

Uploaded Python 3

File details

Details for the file rmet-0.1.4.tar.gz.

File metadata

  • Download URL: rmet-0.1.4.tar.gz
  • Upload date:
  • Size: 12.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for rmet-0.1.4.tar.gz
Algorithm Hash digest
SHA256 0d525dd2a46d218a35b4f59c9e8f8a8f95970917a4f30c7b7867b37979a4c1c5
MD5 387a7f39f3c7f48f68512a377cf12ff7
BLAKE2b-256 b9dec786b6dcfa2ee3934329e88d0c7579daf515267aa7285a834eda3d71b75e


Provenance

The following attestation bundles were made for rmet-0.1.4.tar.gz:

Publisher: publish-to-pypi.yaml on Tigxy/recommender-metrics

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file rmet-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: rmet-0.1.4-py3-none-any.whl
  • Upload date:
  • Size: 10.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for rmet-0.1.4-py3-none-any.whl
Algorithm Hash digest
SHA256 5da2fac91e31b4198d569ab13d486b008861370aa0a183373afc695158e66e8c
MD5 5c9c6f054eae6c30fb7a9deb4ac8defd
BLAKE2b-256 dc258e51c53efc9a51767caeb40035836cf94bf71a544062ee001f1800b6b879


Provenance

The following attestation bundles were made for rmet-0.1.4-py3-none-any.whl:

Publisher: publish-to-pypi.yaml on Tigxy/recommender-metrics

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
