A Unified View of Evaluation Metrics for Structured Prediction
metametric

The metametric Python package offers a set of tools for quickly and easily defining and implementing evaluation metrics for a variety of structured prediction tasks in natural language processing (NLP), based on the framework presented in the following paper:
A Unified View of Evaluation Metrics for Structured Prediction. Yunmo Chen, William Gantt, Tongfei Chen, Aaron Steven White, and Benjamin Van Durme. EMNLP 2023.
The key features of the package include:
- A decorator for automatically defining and implementing a custom metric for an arbitrary dataclass.
- A collection of generic components for defining arbitrary new metrics based on the framework in the paper.
- Implementations of a number of metrics for common structured prediction tasks.
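The core idea of the framework is that a metric over two collections of predicted and gold structures can be induced from a similarity function on individual structures via a maximum matching. As a rough, library-independent sketch of that idea (the `Relation` dataclass and `similarity` function below are illustrative inventions, not the metametric API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Relation:
    # A toy structured-prediction output: a labeled relation between two spans.
    subj: str
    obj: str
    label: str

def similarity(a: Relation, b: Relation) -> float:
    # An illustrative structure-level similarity: full credit for an exact
    # match, partial credit if only the label differs.
    if a == b:
        return 1.0
    if (a.subj, a.obj) == (b.subj, b.obj):
        return 0.5
    return 0.0

def max_matching(pred: list[Relation], gold: list[Relation]) -> float:
    # Brute-force maximum one-to-one matching between predicted and gold
    # items (exponential, but fine for a small example); a real
    # implementation would use an efficient assignment solver.
    if not pred or not gold:
        return 0.0
    first, rest = pred[0], pred[1:]
    best = max_matching(rest, gold)  # leave `first` unmatched
    for i, g in enumerate(gold):
        s = similarity(first, g)
        if s > 0.0:
            best = max(best, s + max_matching(rest, gold[:i] + gold[i + 1:]))
    return best

# Precision, recall, and F1 then fall out by normalizing the matching score
# by the sizes of the predicted and gold sets, respectively.
gold = [Relation("a", "b", "r1"), Relation("c", "d", "r2")]
pred = [Relation("a", "b", "r1"), Relation("c", "d", "r3")]
score = max_matching(pred, gold)          # 1.0 + 0.5 = 1.5
precision = score / len(pred)             # 0.75
recall = score / len(gold)                # 0.75
f1 = 2 * precision * recall / (precision + recall)
```

Swapping in a different dataclass and similarity function yields a different task-specific metric, which is the generalization the package automates.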
To install, run:
pip install metametric
If you use this codebase in your work, please cite the following paper:
@inproceedings{metametric,
title={A Unified View of Evaluation Metrics for Structured Prediction},
author={Yunmo Chen and William Gantt and Tongfei Chen and Aaron Steven White and Benjamin {Van Durme}},
booktitle={Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing},
year={2023},
address={Singapore},
}