Set of metrics to assess behavior annotations of videos.

Project description

Behavior Annotation Score (BANOS)

GitHub Repo | Installation | Example Usage

Overview

This library calculates the Behavior Annotation Score (BANOS) for behavior annotations in video data, with implementations available in Python and Matlab. BANOS is a set of metrics for evaluating algorithmic annotations against a ground truth, integrating accuracy, overlap, temporal precision, and continuity of behavior annotation segments. These aspects are essential for researchers and practitioners in ethology and computer vision.

Ethological Context and Key Concepts

Background

In ethology (the science of animal behavior), the automatic annotation of behaviors from video data faces specific challenges for precise, contextually relevant annotations. Traditional metrics often focus on specific aspects of annotation performance, such as accuracy in a narrow sense, which may not fully encompass the ethological significance or practical applicability of an algorithm's output. There is a need for a more comprehensive metric framework that considers the diverse aspects of behavior annotation quality, providing a more complete picture of an algorithm's effectiveness in real-world scenarios.

Introducing the Behavior Annotation Score (BANOS)

The BANOS is a set of metrics tailored for evaluating algorithmic behavior annotations against ground truths (typically human annotations), integrating multiple facets of accuracy to provide a comprehensive assessment.

BANOS Metrics Formulas

BANOS consists of the following metrics, all ranging from 0 (lowest score) to 1 (highest score), each with a specific formula and each offering a distinct perspective on an algorithm's performance:

  1. Detection Accuracy (DA)

    • Assesses the accuracy of detecting behavioral segments using Precision, Recall, and F1 score.
    • Precision (P): TP / (TP + FP)
    • Recall (R): TP / (TP + FN)
    • F1 Score: (2 × P × R) / (P + R)
  2. Segment Overlap (SO)

    • Assesses the temporal overlap quality of each annotated segment with temporal Intersection over Union (tIoU).
    • Temporal Intersection over Union (tIoU): Intersection of Predicted and Ground Truth Segments / Union of Predicted and Ground Truth Segments
  3. Temporal Precision (TP)

    • Assesses the precision of predicted segment start and end times, based on the absolute differences between predicted and ground-truth timings.
    • Temporal Precision: 1 / (1 + Absolute Start Time Deviation + Absolute End Time Deviation)
  4. Intra-bout Continuity (IC)

    • Assesses the consistency of annotation within each segment by counting the number of label switches within the segment.
    • Intra-bout Continuity (IC): 1 - (Number of Label Switches within Segment / Segment Length)
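The four formulas above can be illustrated on a toy frame-level example. This is a hypothetical sketch of the arithmetic only; BANOS's own implementation may differ in how it matches predicted segments to ground-truth segments and aggregates scores.

```python
# Toy frame-by-frame annotations for one behavior (1 = behavior present).
pred = [0, 1, 1, 1, 0, 0, 1, 1, 0, 0]
gt   = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]

# 1. Detection Accuracy: frame-level Precision, Recall, F1.
tp = sum(p == 1 and g == 1 for p, g in zip(pred, gt))
fp = sum(p == 1 and g == 0 for p, g in zip(pred, gt))
fn = sum(p == 0 and g == 1 for p, g in zip(pred, gt))
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

# 2. Segment Overlap: tIoU between the first predicted segment
#    (frames 1-3, inclusive) and its ground-truth counterpart (frames 2-4).
pred_seg, gt_seg = (1, 3), (2, 4)
inter = max(0, min(pred_seg[1], gt_seg[1]) - max(pred_seg[0], gt_seg[0]) + 1)
union = (pred_seg[1] - pred_seg[0] + 1) + (gt_seg[1] - gt_seg[0] + 1) - inter
tiou = inter / union

# 3. Temporal Precision: 1 / (1 + |start deviation| + |end deviation|),
#    here measured in frames.
tprec = 1 / (1 + abs(pred_seg[0] - gt_seg[0]) + abs(pred_seg[1] - gt_seg[1]))

# 4. Intra-bout Continuity: 1 - (label switches / segment length),
#    evaluated on the predictions over the second ground-truth bout (frames 6-7).
seg_pred = pred[6:8]
switches = sum(a != b for a, b in zip(seg_pred, seg_pred[1:]))
ic = 1 - switches / len(seg_pred)

print(f"P={precision:.2f} R={recall:.2f} F1={f1:.2f}")   # P=0.80 R=0.80 F1=0.80
print(f"tIoU={tiou:.2f} TP={tprec:.2f} IC={ic:.2f}")     # tIoU=0.50 TP=0.33 IC=1.00
```

The predicted segment starts one frame late and ends one frame early, so Temporal Precision is 1/(1+1+1) = 1/3, while the second bout is predicted without interruption, giving a perfect continuity score.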

Installation

Install the BANOS package directly from PyPI:

pip install BANOS

Python Dependencies

  • pandas: This dependency should be automatically installed when you install BANOS from PyPI.

Usage

Prepare your data as a dictionary where keys are file names and values are tuples of prediction and ground truth DataFrames. Each DataFrame should contain binary (0/1) values, with one column per behavior.
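As a concrete illustration of that format, the sketch below builds a minimal `data_dict` by hand instead of reading CSVs. The behavior names and frame counts are hypothetical; the only requirements are the dictionary-of-tuples layout and binary values.

```python
import pandas as pd

# One row per video frame, one column per behavior, values 0/1.
predictions = pd.DataFrame({
    "walking":  [1, 1, 0, 0, 0],
    "grooming": [0, 0, 1, 1, 0],
})
ground_truth = pd.DataFrame({
    "walking":  [1, 1, 1, 0, 0],
    "grooming": [0, 0, 0, 1, 1],
})

# Keys are file (video) names; values are (prediction, ground truth) tuples.
data_dict = {"video1": (predictions, ground_truth)}
```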

Example

# Loading data and using the library
import pandas as pd
from BANOS import preprocess_data, calculate_banos_for_each_file, aggregate_metrics

data_dict = {
    'file1': (pd.read_csv('predictions_file1.csv'), pd.read_csv('groundtruth_file1.csv')),
    'file2': (pd.read_csv('predictions_file2.csv'), pd.read_csv('groundtruth_file2.csv')),
    # ... more files ...
}

preprocessed_data, dropped_info = preprocess_data(data_dict)
banos_metrics = calculate_banos_for_each_file(preprocessed_data)
group_metrics, overall_metrics = aggregate_metrics(banos_metrics)

print("Group Metrics:", group_metrics)
print("Overall Metrics:", overall_metrics)

Contributions and Support

For contributions or support, please open a pull request or issue in the GitHub repository.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

BANOS-0.1.5.tar.gz (7.0 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

BANOS-0.1.5-py3-none-any.whl (7.4 kB)

Uploaded Python 3

File details

Details for the file BANOS-0.1.5.tar.gz.

File metadata

  • Download URL: BANOS-0.1.5.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for BANOS-0.1.5.tar.gz
Algorithm Hash digest
SHA256 f2af0d6da42c2fe409bd2bc085a083a0b577dae6efcfb51cfc2355f674ac4810
MD5 660fc06f0e6771c12a5eaf4d22feadf7
BLAKE2b-256 bf71d17c71a1df463a914f5545c43ee8728955cb9faa04ce6874b9614e3fca13

File details

Details for the file BANOS-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: BANOS-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 7.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for BANOS-0.1.5-py3-none-any.whl
Algorithm Hash digest
SHA256 b57dc1c9cb15991ebbe7fb927178573e57ad24d1b0e20994b2cf1c9a957c2f78
MD5 805ac82b10692f51c9946ce6fc743648
BLAKE2b-256 5d04461cee8339981b2c505e322b3cd1389b368872de9458b77259c559a908ef
