label-studio-evalme

Evaluation metrics package

Installation

Simple installation from PyPI

```bash
pip install label-studio-evalme
```
Other installation methods
Pip from source
```bash
# with git
pip install git+https://github.com/heartexlabs/label-studio-evalme.git@master
```
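
To confirm that the package is importable after installation, here is a quick check (the `evalme` module name matches the imports used in the examples below):

```python
# Post-install sanity check: the PyPI package label-studio-evalme
# installs the importable module "evalme"
from evalme.matcher import Matcher

print("evalme is installed:", Matcher.__module__)
```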

What is Evalme?

Evalme is a collection of Label Studio evaluation metric implementations and an easy-to-use API to create custom metrics. It offers:

  • A standardized interface to increase reproducibility
  • Reduced boilerplate
  • Optimized metrics for Label Studio

Get started with Evalme

You can use Evalme with any version of Label Studio or with Label Studio Enterprise.

Load existing data from Label Studio

Use the Label Studio REST API to load existing data from your instance of Label Studio or Label Studio Enterprise.

Specify your Label Studio URL, access token, and project ID in the parameters:

```python
from evalme.matcher import Matcher

# Connect to a running Label Studio instance
loader = Matcher(url="http://127.0.0.1:8000",
                 token="ACCESS_TOKEN",
                 project='1')
# Fetch the project's existing annotations over the REST API
loader.refresh()
```
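
If you prefer not to hard-code credentials, the token can be read from the environment instead. A minimal sketch, assuming hypothetical `LS_URL` and `LS_TOKEN` variable names:

```python
import os

from evalme.matcher import Matcher

# LS_URL and LS_TOKEN are hypothetical variable names;
# use whatever your deployment provides
loader = Matcher(
    url=os.environ.get("LS_URL", "http://127.0.0.1:8000"),
    token=os.environ["LS_TOKEN"],  # fail fast if the token is missing
    project='1',
)
loader.refresh()
```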

You can also load data from annotation files exported from Label Studio, using either the API or the Label Studio UI:

```python
from evalme.matcher import Matcher

# Load a local file exported from Label Studio
loader = Matcher()
loader.load('your_filename')
```

After you load data, it is available in the `_raw_data` field.
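
For a quick look at what was loaded, you can inspect that field directly. A sketch, assuming `_raw_data` holds the export as a list-like container of task dicts (an assumption about the structure, not a documented contract):

```python
# Assumes _raw_data mirrors the Label Studio export as a list of dicts;
# this structure is an assumption, not a documented contract
print(len(loader._raw_data))
print(loader._raw_data[0])
```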

Built-in metrics

By default, a naive metric object is used. It evaluates annotation differences with a naive approach: if one object is fully equal to another, the evaluation method returns 1; otherwise it returns 0.

To use the built-in metrics, do the following:

```python
from evalme.matcher import Matcher

loader = Matcher()
loader.load('your_filename')
# Run the agreement_matrix method to get a matrix for all your annotations
matrix = loader.agreement_matrix()
# Print the result
print(matrix)
```

Implement your own metric

You can implement your own metric by creating an evaluation function and registering it in the Metrics class.

For example, create an evaluation function with two parameters for the compared objects:

```python
from evalme.matcher import Matcher
from evalme.metrics import Metrics  # assumed module path for the Metrics class


# Write your own evaluation function or use an existing one
def naive(x, y):
    """Naive comparison of annotations."""
    if len(x) != len(y):
        result = 0
    else:
        for i in range(len(x)):
            if x[i]['value'] != y[i]['value']:
                result = 0
                break
        else:
            result = 1
    return result


# Register it in the Metrics object
Metrics.register(
    name='naive',
    form=None,
    tag='all',
    func=naive,
    desc='Naive comparison of result dict'
)

# Create a Matcher object as in the previous example
loader = Matcher()
loader.load('your_filename')
matrix = loader.agreement_matrix(metric_name='naive')
# Print the result
print(matrix)
```
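
Before relying on a custom metric in the matrix, it can help to sanity-check the function on toy data. This continues from the `naive` function defined above; the annotation results here are invented for the example:

```python
# Toy results shaped like Label Studio annotation "result" lists
# (invented data, for illustration only)
r1 = [{'value': {'choices': ['cat']}}]
r2 = [{'value': {'choices': ['dog']}}]

assert naive(r1, r1) == 1  # identical results match
assert naive(r1, r2) == 0  # differing values do not
assert naive(r1, []) == 0  # length mismatch scores 0
print("naive metric behaves as expected")
```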

Contribute!

The Label Studio team is hard at work adding even more metrics, and we're looking for incredible contributors like you to submit new metrics and improve existing ones!

Join our Slack community to get help becoming a contributor!

Community

For help or questions, join our huge community on Slack!

License

This project is distributed under the MIT License listed in this repository.
