A Unified View of Evaluation Metrics for Structured Prediction
The metametric Python package offers a set of tools for quickly and easily defining and implementing evaluation metrics for a variety of structured prediction tasks in natural language processing (NLP), based on the framework presented in the following paper:
A Unified View of Evaluation Metrics for Structured Prediction. Yunmo Chen, William Gantt, Tongfei Chen, Aaron Steven White, and Benjamin Van Durme. EMNLP 2023.
The key features of the package include:
- A decorator for automatically defining and implementing a custom metric for an arbitrary dataclass.
- A collection of generic components for defining arbitrary new metrics based on the framework in the paper.
- Implementations of a number of metrics for common structured prediction tasks.
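The framework underlying the package treats a metric as a similarity between structured objects, built up compositionally from similarities on their parts (e.g., matching predicted substructures against gold ones). As a rough, library-independent sketch of that idea (the dataclass and function below are illustrative and are not the package's actual API), here is an exact-match F1 over sets of predicted and gold relation structures:

```python
from dataclasses import dataclass

# Illustrative structure; any frozen (hashable) dataclass would work here.
@dataclass(frozen=True)
class Relation:
    subject: str
    predicate: str
    object: str

def exact_match_f1(predicted: set, gold: set) -> float:
    """F1 over exact matches between predicted and gold structures."""
    if not predicted or not gold:
        return 0.0
    matched = len(predicted & gold)  # structures identical in all fields
    if matched == 0:
        return 0.0
    precision = matched / len(predicted)
    recall = matched / len(gold)
    return 2 * precision * recall / (precision + recall)

pred = {Relation("Alice", "works_at", "Acme"), Relation("Bob", "lives_in", "Paris")}
gold = {Relation("Alice", "works_at", "Acme"), Relation("Bob", "lives_in", "Lyon")}
print(exact_match_f1(pred, gold))  # 0.5
```

The framework in the paper generalizes this pattern: rather than requiring exact equality, substructures can be matched under softer, task-specific similarities, and the same compositional recipe yields metrics for many structured prediction tasks.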
To install, run:
pip install metametric
If you use this codebase in your work, please cite the following paper:
@inproceedings{metametric,
  title={A Unified View of Evaluation Metrics for Structured Prediction},
  author={Yunmo Chen and William Gantt and Tongfei Chen and Aaron Steven White and Benjamin {Van Durme}},
  booktitle={Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing},
  year={2023},
  address={Singapore},
}
Download files
Source Distribution: metametric-0.1.2.tar.gz (15.9 kB)

Built Distribution: metametric-0.1.2-py3-none-any.whl (21.2 kB)

Hashes for metametric-0.1.2-py3-none-any.whl:

Algorithm | Hash digest
---|---
SHA256 | 16529e89bef1633dbdbe9f9ca0bdd38adfab56bbf7d137aa0b3e79e53f5745f7
MD5 | 3b37b655a41c5afe1ec2062156ccfdba
BLAKE2b-256 | ef5c04eac613533529e3c71b8857fc9b14a12d2496f1294427ab6b2dd6e3dbfd