Several tools for dealing with image annotations to train YOLO or similar models

Project description

Pflow

Install

pip install -e '.[dev]'

Setup

cp .env.default .env

Run

pflows doc/examples/birds-grouped-categories.json

In this example, we define a workflow that downloads a dataset from Roboflow and processes it, as shown in the image below:

Birds Dataset

The workflow is defined in a JSON file, where we define the steps to be executed:

[
  {
    "task": "roboflow_tools.download_dataset",
    "target_dir": "{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50",
    "url": "https://universe.roboflow.com/explainableai-lavbv/cub200_parts/dataset/50"
  },
  {
    "task": "yolo_v8.load_dataset",
    "folder_path": "{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50"
  },
  {
    "task": "base.count_images"
  },
  {
    "task": "base.count_categories"
  },
  {
    "task": "categories.group_categories",
    "groups": {
      "upper": [["eye", "bill", "head", "nape", "throat"]],
      "lower": [["belly", "feet", "tail"]],
      "middle": [["Wing", "breast", "back"]],
      "Wing": [["Wing"]],
      "back": [["back"]],
      "belly": [["belly"]],
      "bill": [["bill"]],
      "eye": [["eye"]],
      "feet": [["feet"]],
      "head": [["head"]],
      "nape": [["nape"]],
      "tail": [["tail"]],
      "throat": [["throat"]]
    },
    "condition": "any"
  },
  {
    "task": "categories.keep",
    "categories": ["upper", "lower", "middle"]
  },
  {
    "task": "base.count_images"
  },
  {
    "task": "base.count_categories"
  },
  {
    "task": "base.show_categories"
  },
  {
    "task": "yolo_v8.write",
    "target_dir": "{{BASE_FOLDER}}/datasets/processed/birds-grouped-categories-cub200_parts-50"
  }
]
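A plausible reading of the `categories.group_categories` step is that every annotation whose category appears in one of a group's member lists is relabeled with the group's name. The sketch below illustrates that interpretation; the function name and annotation shape are assumptions for illustration, not the actual pflows implementation.

```python
def group_categories(annotations, groups):
    """Relabel annotation categories according to a group mapping.

    groups has the shape {group_name: [[member, ...]]}, as in the
    workflow JSON above. Categories not listed in any group are kept.
    """
    # Invert {group: [[members]]} into a flat {member: group} lookup.
    member_to_group = {}
    for group, member_lists in groups.items():
        for members in member_lists:
            for member in members:
                member_to_group[member] = group
    return [
        {**ann, "category": member_to_group.get(ann["category"], ann["category"])}
        for ann in annotations
    ]

groups = {"upper": [["eye", "bill", "head"]], "lower": [["belly", "feet", "tail"]]}
annotations = [{"category": "eye"}, {"category": "tail"}, {"category": "back"}]
print(group_categories(annotations, groups))
# [{'category': 'upper'}, {'category': 'lower'}, {'category': 'back'}]
```

Note that in the workflow above a category such as `Wing` appears in more than one group; in this sketch the last group listed would win, which is one reason the real task likely has more elaborate semantics.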

The workflow is composed of the following steps:

  1. Download the dataset from Roboflow
  2. Load the dataset
  3. Count the number of images
  4. Count the number of categories
  5. Group the existing categories into new ones (upper, lower, middle)
  6. Keep only the categories "upper", "lower", and "middle"
  7. Count the number of images
  8. Count the number of categories
  9. Show the categories
  10. Write the dataset to disk
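The `"task"` field follows a `module.function` naming convention, so the steps above could be dispatched by a small runner that looks each task up and threads the dataset from step to step. The registry and task signature below are assumptions sketched for illustration, not the actual pflows internals.

```python
import json

# Assumed task registry: maps "module.function" names to callables.
TASKS = {}

def task(name):
    """Decorator that registers a task under its workflow name."""
    def register(func):
        TASKS[name] = func
        return func
    return register

@task("base.count_images")
def count_images(dataset):
    print(f"images: {len(dataset['images'])}")
    return dataset

def run_workflow(steps, dataset=None):
    """Run each step; every task receives the dataset plus the
    remaining JSON keys as keyword arguments, and returns the
    (possibly transformed) dataset."""
    for step in steps:
        step = dict(step)
        func = TASKS[step.pop("task")]
        dataset = func(dataset, **step)
    return dataset

steps = json.loads('[{"task": "base.count_images"}]')
run_workflow(steps, {"images": [1, 2, 3]})  # prints "images: 3"
```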

As we can see, the {{BASE_FOLDER}} variable refers to the base folder of the project. Such variables are defined in the .env file, which configures the project.
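Resolving a `{{VARIABLE}}` placeholder against values loaded from the .env file could be as simple as a regex substitution. The `{{…}}` syntax comes from the workflow above; the resolution code itself is a sketch, not the actual pflows implementation.

```python
import os
import re

def resolve(value, env=os.environ):
    """Replace every {{NAME}} placeholder with env[NAME]."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: env[m.group(1)], value)

env = {"BASE_FOLDER": "/home/user/pflows"}  # hypothetical .env contents
print(resolve("{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50", env))
# /home/user/pflows/datasets/downloaded/cub200_parts-50
```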

The input and output images are shown below:

Example Bird 1:

Bird 1 (before and after)

Example Bird 2:

Bird 2 (before and after)
