A Unified View of Evaluation Metrics for Structured Prediction

metametric

The metametric Python package provides tools for quickly defining and implementing evaluation metrics for a variety of structured prediction tasks in natural language processing (NLP), based on the framework presented in the following paper:

A Unified View of Evaluation Metrics for Structured Prediction. Yunmo Chen, William Gantt, Tongfei Chen, Aaron Steven White, and Benjamin Van Durme. EMNLP 2023.

The key features of the package include:

  • A decorator for automatically defining and implementing a custom metric for an arbitrary dataclass.
  • A collection of generic components for defining arbitrary new metrics based on the framework in the paper.
  • Implementations of a number of metrics for common structured prediction tasks.
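To make the framework's central idea concrete, here is a minimal, library-free sketch (the names below are illustrative assumptions, not the package's actual API): a metric is built from a similarity function on substructures, and a prediction is scored against a reference by finding the best one-to-one matching of their items under that similarity.

```python
from dataclasses import dataclass
from itertools import permutations

# Hypothetical structured-prediction output type, for illustration only.
@dataclass(frozen=True)
class Relation:
    subj: str
    pred: str
    obj: str

def similarity(a: Relation, b: Relation) -> float:
    """Similarity of two items: fraction of matching fields (1.0 = exact match)."""
    fields = [(a.subj, b.subj), (a.pred, b.pred), (a.obj, b.obj)]
    return sum(x == y for x, y in fields) / len(fields)

def match_score(pred: list, gold: list) -> float:
    """Total similarity under the best one-to-one matching (brute force)."""
    if not pred or not gold:
        return 0.0
    shorter, longer = sorted([pred, gold], key=len)
    return max(
        sum(similarity(a, b) for a, b in zip(perm, shorter))
        for perm in permutations(longer, len(shorter))
    )

def precision_recall_f1(pred: list, gold: list):
    """Precision, recall, and F1 derived from the matching score."""
    total = match_score(pred, gold)
    p = total / len(pred) if pred else 0.0
    r = total / len(gold) if gold else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1
```

With an exact-match similarity this recovers the familiar set-based precision/recall/F1; swapping in a partial-credit similarity yields softer variants, which is the kind of compositional reuse the package's generic components are designed for.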

To install, run:

pip install metametric

If you use this codebase in your work, please cite the following paper:

@inproceedings{metametric,
    title={A Unified View of Evaluation Metrics for Structured Prediction},
    author={Yunmo Chen and William Gantt and Tongfei Chen and Aaron Steven White and Benjamin {Van Durme}},
    booktitle={Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing},
    year={2023},
    address={Singapore},
}
