
DataFlow-CV

Where Vibe Coding meets CV data. 🌊 Convert & visualize datasets. Built with the flow of Claude Code.


A data processing library for computer vision datasets, focusing on format conversion and visualization between LabelMe, COCO, and YOLO formats. Provides both a CLI and Python API.


Project Structure

dataflow/
├── __init__.py              # Package exports and convenience functions
├── cli.py                   # Command-line interface
├── config.py                # Configuration management
├── convert/                 # Format conversion module
│   ├── __init__.py
│   ├── base.py             # Converter base class
│   ├── coco_to_yolo.py     # COCO to YOLO converter
│   └── yolo_to_coco.py     # YOLO to COCO converter
├── visualize/               # Annotation visualization module
│   ├── __init__.py
│   ├── base.py             # Visualizer base class
│   ├── yolo.py             # YOLO annotation visualizer
│   ├── coco.py             # COCO annotation visualizer
│   └── labelme.py          # LabelMe annotation visualizer
└── label/                   # Label format handlers module
    ├── __init__.py
    ├── yolo.py             # YOLO format handler
    ├── coco.py             # COCO format handler
    └── labelme.py          # LabelMe format handler
tests/
├── __init__.py
├── convert/                # Conversion tests
│   ├── __init__.py
│   ├── test_coco_to_yolo.py
│   └── test_yolo_to_coco.py
├── visualize/              # Visualization tests
│   ├── __init__.py
│   ├── test_yolo.py
│   ├── test_coco.py
│   ├── test_labelme.py
│   └── test_generic.py     # Generic visualizer tests
└── run_tests.py            # Test runner
samples/
├── __init__.py
├── example_usage.py        # Quick usage demonstration
├── cli/                    # CLI usage examples
│   ├── __init__.py
│   ├── convert/
│   │   ├── cli_coco_to_yolo.py
│   │   └── cli_yolo_to_coco.py
│   └── visualize/
│       ├── cli_yolo.py
│       ├── cli_coco.py
│       └── cli_labelme.py
└── api/                    # Python API examples
    ├── __init__.py
    ├── convert/
    │   ├── api_coco_to_yolo.py
    │   └── api_yolo_to_coco.py
    └── visualize/
        ├── api_yolo.py
        ├── api_coco.py
        └── api_labelme.py

Requirements

Core Dependencies

  • Python 3.8 or higher
  • Linux environment (POSIX compatible, assumes POSIX paths)
  • click >= 8.1.0 – CLI framework
  • numpy >= 2.0.0 – numerical operations
  • opencv-python >= 4.8.0 – image processing (optional, used for some image operations)
  • Pillow >= 10.0.0 – image reading (optional, used for reading image dimensions)

Quick Start

Installation

# Regular installation from source
pip install .

# Editable installation (development mode)
# Due to setuptools compatibility, use python setup.py develop (not pip install -e .)
python setup.py develop
# After editable installation, use python -m dataflow.cli instead of the dataflow command

Command Line Usage

Global options: --verbose (-v) for progress output, --overwrite to replace existing files.

# COCO to YOLO conversion (use --segmentation for polygon annotations)
dataflow convert coco2yolo annotations.json output_dir/
dataflow convert coco2yolo annotations.json output_dir/ --segmentation

# YOLO to COCO conversion
dataflow convert yolo2coco images/ labels/ classes.names output.json

# Visualize YOLO annotations (use --save to export images)
dataflow visualize yolo images/ labels/ classes.names
dataflow visualize yolo images/ labels/ classes.names --save output_dir/

# Visualize COCO annotations (use --save to export images)
dataflow visualize coco images/ annotations.json
dataflow visualize coco images/ annotations.json --save output_dir/

# Visualize LabelMe annotations (use --save to export images)
dataflow visualize labelme images/ labels/
dataflow visualize labelme images/ labels/ --save output_dir/

# Show configuration
dataflow config

# Get help
dataflow --help
dataflow convert coco2yolo --help
dataflow visualize yolo --help
dataflow visualize labelme --help

See the CLI Reference below for detailed usage.

Python API Usage

import dataflow

# COCO to YOLO conversion (pass segmentation=True for polygon annotations)
result = dataflow.coco_to_yolo("annotations.json", "output_dir")
result = dataflow.coco_to_yolo("annotations.json", "output_dir", segmentation=True)
print(f"Processed {result['images_processed']} images")

# YOLO to COCO conversion
result = dataflow.yolo_to_coco("images/", "labels/", "classes.names", "output.json")
print(f"Generated {result['annotations_processed']} annotations")

# Visualize YOLO annotations (save_dir is optional)
result = dataflow.visualize_yolo("images/", "labels/", "classes.names")
result = dataflow.visualize_yolo("images/", "labels/", "classes.names", save_dir="output_dir/")
print(f"Visualized {result['images_processed']} images")

# Visualize COCO annotations (save_dir is optional)
result = dataflow.visualize_coco("images/", "annotations.json")
result = dataflow.visualize_coco("images/", "annotations.json", save_dir="output_dir/")
print(f"Visualized {result['images_processed']} images")

# Visualize LabelMe annotations (save_dir is optional)
result = dataflow.visualize_labelme("images/", "labels/")
result = dataflow.visualize_labelme("images/", "labels/", save_dir="output_dir/")
print(f"Visualized {result['images_processed']} images")
print(f"Classes found: {result['classes_found']}")

CLI Reference

The CLI follows a hierarchical structure: dataflow <main-task> <sub-task> [arguments]. Global options can be placed before the main task.

Global Options

  • --verbose, -v: Enable verbose output (progress information)
  • --overwrite: Overwrite existing files

Conversion Commands

COCO to YOLO

dataflow convert coco2yolo COCO_JSON_PATH OUTPUT_DIR [--segmentation]
  • COCO_JSON_PATH: Path to COCO JSON annotation file
  • OUTPUT_DIR: Directory where labels/ and class.names will be created
  • --segmentation, -s: Handle segmentation annotations (polygon format)

YOLO to COCO

dataflow convert yolo2coco IMAGE_DIR YOLO_LABELS_DIR YOLO_CLASS_PATH COCO_JSON_PATH
  • IMAGE_DIR: Directory containing image files
  • YOLO_LABELS_DIR: Directory containing YOLO label files (.txt)
  • YOLO_CLASS_PATH: Path to YOLO class names file (e.g., class.names)
  • COCO_JSON_PATH: Path to save COCO JSON file
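The core coordinate math behind this conversion can be sketched as follows. This is an illustrative helper, not part of the DataFlow-CV API: YOLO stores normalized, center-based boxes while COCO stores absolute, top-left-based boxes.

```python
def yolo_bbox_to_coco(xc, yc, w, h, img_w, img_h):
    """Convert a normalized YOLO box (center x/y, width, height)
    to a COCO box [x_min, y_min, width, height] in pixels."""
    abs_w = w * img_w
    abs_h = h * img_h
    x_min = xc * img_w - abs_w / 2
    y_min = yc * img_h - abs_h / 2
    return [x_min, y_min, abs_w, abs_h]

# Example: a box centered in a 640x480 image, covering half of each dimension
yolo_bbox_to_coco(0.5, 0.5, 0.5, 0.5, 640, 480)  # -> [160.0, 120.0, 320.0, 240.0]
```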

Visualization Commands

Visualize YOLO annotations

dataflow visualize yolo IMAGE_DIR LABEL_DIR CLASS_PATH [--save SAVE_DIR]
  • IMAGE_DIR: Directory containing image files
  • LABEL_DIR: Directory containing YOLO label files (.txt)
  • CLASS_PATH: Path to class names file (e.g., class.names)
  • --save SAVE_DIR: Optional directory to save visualized images
  • --segmentation: Strict segmentation mode (visualize only polygon annotations)

Visualize COCO annotations

dataflow visualize coco IMAGE_DIR ANNOTATION_JSON [--save SAVE_DIR]
  • IMAGE_DIR: Directory containing image files
  • ANNOTATION_JSON: Path to COCO JSON annotation file
  • --save SAVE_DIR: Optional directory to save visualized images
  • --segmentation: Strict segmentation mode (visualize only polygon annotations)

Visualize LabelMe annotations

dataflow visualize labelme IMAGE_DIR LABEL_DIR [--save SAVE_DIR]
  • IMAGE_DIR: Directory containing image files
  • LABEL_DIR: Directory containing LabelMe JSON files
  • --save SAVE_DIR: Optional directory to save visualized images
  • --segmentation: Strict segmentation mode (visualize only polygon annotations)

Configuration Command

dataflow config

Shows the current configuration (file extensions, default values, CLI context).

Getting Help

dataflow --help
dataflow convert --help
dataflow convert coco2yolo --help
dataflow convert yolo2coco --help
dataflow visualize --help
dataflow visualize yolo --help
dataflow visualize coco --help
dataflow visualize labelme --help

Segmentation Support

DataFlow-CV supports both bounding box and polygon segmentation annotations across all formats:

YOLO Segmentation Format

  • Detection format: class_id x_center y_center width height (normalized coordinates)
  • Segmentation format: class_id x1 y1 x2 y2 ... (polygon vertices, normalized)
  • YOLO segmentation files have the same .txt extension as detection files
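Because both formats share the .txt extension, a label line can only be classified by its field count. A minimal sketch of that logic (a hypothetical helper, not the library's API):

```python
def parse_yolo_line(line):
    """Classify a YOLO label line as 'detection' or 'segmentation'.

    Detection lines carry a class id plus 4 box values; segmentation
    lines carry a class id plus at least 3 (x, y) vertex pairs.
    """
    fields = line.split()
    class_id = int(fields[0])
    coords = [float(v) for v in fields[1:]]
    if len(coords) == 4:
        return class_id, "detection", coords
    if len(coords) >= 6 and len(coords) % 2 == 0:
        return class_id, "segmentation", coords
    raise ValueError(f"Malformed YOLO label line: {line!r}")

parse_yolo_line("0 0.5 0.5 0.2 0.3")           # detection box
parse_yolo_line("1 0.1 0.1 0.9 0.1 0.5 0.9")   # 3-point polygon
```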

COCO Segmentation Format

  • Polygon coordinates in segmentation field (list of [x1, y1, x2, y2, ...])
  • Both single-polygon and multi-polygon annotations are supported
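For reference, a minimal COCO annotation carrying one polygon looks like this (illustrative values; multi-polygon annotations simply hold more flat lists in "segmentation"):

```python
# One COCO annotation with a single rectangular polygon.
annotation = {
    "id": 1,
    "image_id": 1,
    "category_id": 3,
    "segmentation": [[10.0, 10.0, 100.0, 10.0, 100.0, 80.0, 10.0, 80.0]],
    "bbox": [10.0, 10.0, 90.0, 70.0],  # [x_min, y_min, width, height]
    "area": 6300.0,
    "iscrowd": 0,
}

# Each polygon is a flat list of (x, y) pairs; rebuild vertex tuples like so:
poly = annotation["segmentation"][0]
vertices = list(zip(poly[0::2], poly[1::2]))
# vertices -> [(10.0, 10.0), (100.0, 10.0), (100.0, 80.0), (10.0, 80.0)]
```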

LabelMe Segmentation Format

  • Rectangle shapes (shape_type: "rectangle") for bounding box annotations
  • Polygon shapes (shape_type: "polygon") for segmentation annotations
  • Each JSON file contains shapes array with annotation data
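A minimal LabelMe document mixing both shape types might look like this (field names follow the LabelMe schema; the labels, coordinates, and version string are made up for illustration):

```python
# A LabelMe-style document with one rectangle and one polygon shape.
labelme_doc = {
    "version": "5.0.0",
    "imagePath": "img_001.jpg",
    "imageWidth": 640,
    "imageHeight": 480,
    "shapes": [
        {"label": "car", "shape_type": "rectangle",
         "points": [[50.0, 60.0], [200.0, 180.0]]},   # two opposite corners
        {"label": "person", "shape_type": "polygon",
         "points": [[300.0, 100.0], [360.0, 100.0], [330.0, 200.0]]},
    ],
}

# Strict segmentation mode would keep only the polygon shapes:
polygons = [s for s in labelme_doc["shapes"] if s["shape_type"] == "polygon"]
```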

Usage Examples

# Convert COCO to YOLO with segmentation annotations
dataflow convert coco2yolo annotations.json output_dir/ --segmentation

# Visualize YOLO annotations in strict segmentation mode (only polygons)
dataflow visualize yolo images/ labels/ classes.names --segmentation

# Visualize COCO annotations in strict segmentation mode
dataflow visualize coco images/ annotations.json --segmentation

# Visualize LabelMe annotations in strict segmentation mode (only polygons)
dataflow visualize labelme images/ labels/ --segmentation

Python API

# Convert COCO to YOLO with segmentation
result = dataflow.coco_to_yolo("annotations.json", "output_dir", segmentation=True)

# Visualize in strict segmentation mode
result = dataflow.visualize_yolo("images/", "labels/", "classes.names", segmentation=True)
result = dataflow.visualize_labelme("images/", "labels/", segmentation=True)

Notes

  • Without the --segmentation flag, both bounding boxes and polygons are processed automatically
  • With --segmentation flag, only valid polygon annotations are processed (strict mode)
  • YOLO segmentation format requires at least 3 points (6 coordinates)
  • COCO segmentation polygons are automatically converted to YOLO normalized coordinates
  • LabelMe format supports both rectangle (shape_type: "rectangle") and polygon (shape_type: "polygon") shapes
  • In segmentation mode, LabelMe visualizer rejects rectangle shapes and only accepts polygon shapes
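The polygon normalization mentioned above (COCO absolute pixels to YOLO [0, 1] coordinates) reduces to dividing x values by image width and y values by image height; a sketch, with an illustrative helper name:

```python
def coco_poly_to_yolo(poly, img_w, img_h):
    """Normalize a flat COCO polygon [x1, y1, x2, y2, ...] by image size,
    yielding YOLO segmentation coordinates in the [0, 1] range."""
    if len(poly) < 6:  # YOLO segmentation requires at least 3 points
        raise ValueError("Polygon must have at least 3 points (6 coordinates)")
    return [v / img_w if i % 2 == 0 else v / img_h
            for i, v in enumerate(poly)]

# Example: a triangle in a 640x480 image
coco_poly_to_yolo([320.0, 240.0, 640.0, 240.0, 480.0, 480.0], 640, 480)
# -> [0.5, 0.5, 1.0, 0.5, 0.75, 1.0]
```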

Running Tests

# Run all tests
python tests/run_tests.py

# Run specific test
python tests/run_tests.py --test TestCocoToYoloConverter

# With verbose output
python tests/run_tests.py -v

Examples

Check the samples/ directory for detailed usage examples:

  • samples/cli/convert/ - CLI conversion examples
  • samples/cli/visualize/ - CLI visualization examples
  • samples/api/convert/ - Python API conversion examples
  • samples/api/visualize/ - Python API visualization examples

License

MIT License © 2026 zjykzj
