Several tools for dealing with image annotations to train YOLO or similar models

Project description

Pflow

Install

pip install -e '.[dev]'

Setup

cp .env.default .env
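
The `.env` file holds the variables that workflow files reference. A minimal example, assuming `BASE_FOLDER` is the only variable this workflow needs (check `.env.default` for the full list; the path shown is a placeholder):

```
BASE_FOLDER=/path/to/pflows
```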

Run

pflows doc/examples/birds-grouped-categories.json

In this example, we define a workflow that downloads a dataset from Roboflow, as shown in the image below:

Birds Dataset

The workflow is defined in a JSON file that lists the steps to execute:

[
  {
    "task": "roboflow_tools.download_dataset",
    "target_dir": "{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50",
    "url": "https://universe.roboflow.com/explainableai-lavbv/cub200_parts/dataset/50"
  },
  {
    "task": "yolo_v8.load_dataset",
    "folder_path": "{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50"
  },
  {
    "task": "base.count_images"
  },
  {
    "task": "base.count_categories"
  },
  {
    "task": "categories.group_categories",
    "groups": {
      "upper": [["eye", "bill", "head", "nape", "throat"]],
      "lower": [["belly", "feet", "tail"]],
      "middle": [["Wing", "breast", "back"]],
      "Wing": [["Wing"]],
      "back": [["back"]],
      "belly": [["belly"]],
      "bill": [["bill"]],
      "eye": [["eye"]],
      "feet": [["feet"]],
      "head": [["head"]],
      "nape": [["nape"]],
      "tail": [["tail"]],
      "throat": [["throat"]]
    },
    "condition": "any"
  },
  {
    "task": "categories.keep",
    "categories": ["upper", "lower", "middle"]
  },
  {
    "task": "base.count_images"
  },
  {
    "task": "base.count_categories"
  },
  {
    "task": "base.show_categories"
  },
  {
    "task": "yolo_v8.write",
    "target_dir": "{{BASE_FOLDER}}/datasets/processed/birds-grouped-categories-cub200_parts-50"
  }
]
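
Conceptually, a runner like this resolves each `"task"` name to a function and calls it with the step's remaining keys as keyword arguments, threading a shared state through the pipeline. The sketch below illustrates that dispatch pattern only; it is not pflows' actual internals, and the `count_images` task and `REGISTRY` names are illustrative:

```python
import json

def count_images(state):
    # Toy task: record how many images are currently loaded in the state.
    return {**state, "image_count": len(state.get("images", []))}

# Toy registry; the real tool resolves names like "yolo_v8.load_dataset"
# to functions in its own task modules.
REGISTRY = {"base.count_images": count_images}

def run_workflow(path, state=None):
    """Run each step of a JSON workflow in order, passing state along."""
    state = state or {}
    with open(path) as f:
        steps = json.load(f)
    for step in steps:
        func = REGISTRY[step["task"]]
        # Every key besides "task" becomes a keyword argument for the task.
        params = {k: v for k, v in step.items() if k != "task"}
        state = func(state, **params)
    return state
```

With this shape, adding a new step type is just registering one more function; the JSON file stays purely declarative.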

The workflow is composed of the following steps:

  1. Download the dataset from Roboflow
  2. Load the dataset
  3. Count the number of images
  4. Count the number of categories
  5. Group the existing categories into new ones (upper, lower, middle)
  6. Keep only the categories "upper", "lower", and "middle"
  7. Count the number of images
  8. Count the number of categories
  9. Show the categories
  10. Write the dataset to disk

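The grouping step (5) effectively remaps annotation categories: any annotation whose category appears in a group's member list is relabeled with the group name. A rough sketch of that remapping, assuming annotations are plain dicts with a `"category"` key and using flat member lists for simplicity (pflows' real data model may differ):

```python
def group_categories(annotations, groups):
    """Relabel each annotation whose category belongs to a group.

    `groups` maps a new category name to a list of member names, e.g.
    {"upper": ["eye", "bill", "head", "nape", "throat"]}.
    """
    # Invert the mapping: member category -> group name.
    member_to_group = {
        member: group
        for group, members in groups.items()
        for member in members
    }
    # Categories not listed in any group are left unchanged.
    return [
        {**ann, "category": member_to_group.get(ann["category"], ann["category"])}
        for ann in annotations
    ]
```

The follow-up `categories.keep` step would then simply drop annotations whose category is not in the kept list.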
Note the {{BASE_FOLDER}} variable, which refers to the base folder of the project. Such variables are defined in the .env file used to configure the project.
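
Substituting {{BASE_FOLDER}} amounts to a simple template pass over string values before the workflow runs. A minimal sketch of that interpolation (the function name and the idea of reading values from the .env-loaded environment are assumptions, not pflows' documented API):

```python
import re

def interpolate(value, env):
    """Replace {{VAR}} placeholders with values from an env mapping."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: env[m.group(1)], value)
```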

We can see the input images and output images below:

Example Bird 1:

Bird 1 before · Bird 1 after

Example Bird 2:

Bird 2 before · Bird 2 after
