
Several tools for dealing with image annotations to train YOLO or similar models

Project description

Pflow

Install

pip install -e '.[dev]'

Setup

cp .env.default .env
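The `.env` file holds the variables that workflows reference. A minimal sketch of what it might contain (only `BASE_FOLDER` appears in the example below; `ROBOFLOW_API_KEY` is an assumption, since downloading from Roboflow normally requires one):

```
# Base folder used by {{BASE_FOLDER}} placeholders in workflow files
BASE_FOLDER=/path/to/your/project

# Assumed: API key for downloading datasets from Roboflow
ROBOFLOW_API_KEY=your_api_key_here
```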

Run

pflows doc/examples/birds-grouped-categories.json

In this example, we define a workflow that downloads a dataset from Roboflow and processes it, as shown in the image below:

(Image: Birds Dataset)

The workflow is defined in a JSON file that lists the steps to be executed:

[
  {
    "task": "roboflow_tools.download_dataset",
    "target_dir": "{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50",
    "url": "https://universe.roboflow.com/explainableai-lavbv/cub200_parts/dataset/50"
  },
  {
    "task": "yolo_v8.load_dataset",
    "folder_path": "{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50"
  },
  {
    "task": "base.count_images"
  },
  {
    "task": "base.count_categories"
  },
  {
    "task": "categories.group_categories",
    "groups": {
      "upper": [["eye", "bill", "head", "nape", "throat"]],
      "lower": [["belly", "feet", "tail"]],
      "middle": [["Wing", "breast", "back"]],
      "Wing": [["Wing"]],
      "back": [["back"]],
      "belly": [["belly"]],
      "bill": [["bill"]],
      "eye": [["eye"]],
      "feet": [["feet"]],
      "head": [["head"]],
      "nape": [["nape"]],
      "tail": [["tail"]],
      "throat": [["throat"]]
    },
    "condition": "any"
  },
  {
    "task": "categories.keep",
    "categories": ["upper", "lower", "middle"]
  },
  {
    "task": "base.count_images"
  },
  {
    "task": "base.count_categories"
  },
  {
    "task": "base.show_categories"
  },
  {
    "task": "yolo_v8.write",
    "target_dir": "{{BASE_FOLDER}}/datasets/processed/birds-grouped-categories-cub200_parts-50"
  }
]

The workflow is composed of the following steps:

  1. Download the dataset from Roboflow
  2. Load the dataset
  3. Count the number of images
  4. Count the number of categories
  5. Group the categories, creating new categories (upper, lower, middle) from the existing ones
  6. Keep only the categories "upper", "lower", and "middle"
  7. Count the number of images
  8. Count the number of categories
  9. Show the categories
  10. Write the dataset to disk
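To make the grouping step (5) concrete, here is a hypothetical sketch of the kind of remapping `categories.group_categories` performs. This is not the pflows implementation; the function name, the annotation dictionaries, and their fields are illustrative assumptions:

```python
# Hypothetical sketch of the logic behind categories.group_categories.
# Each annotation whose category belongs to a group is re-emitted under
# the group's name; categories matching no group are dropped here,
# mirroring the effect of the later categories.keep step.

def group_categories(annotations, groups):
    # Build a reverse index: original category -> list of group names
    reverse = {}
    for group_name, member_lists in groups.items():
        for members in member_lists:
            for member in members:
                reverse.setdefault(member, []).append(group_name)

    grouped = []
    for ann in annotations:
        for group_name in reverse.get(ann["category"], []):
            grouped.append({**ann, "category": group_name})
    return grouped

annotations = [
    {"category": "eye", "bbox": [10, 10, 4, 4]},
    {"category": "tail", "bbox": [50, 80, 12, 6]},
]
groups = {
    "upper": [["eye", "bill", "head", "nape", "throat"]],
    "lower": [["belly", "feet", "tail"]],
}
print([a["category"] for a in group_categories(annotations, groups)])
# -> ['upper', 'lower']
```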

As we can see, the {{BASE_FOLDER}} variable refers to the base folder of the project. Such variables are defined in the .env file, which is used to configure the project.
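The {{VARIABLE}} substitution can be sketched as a simple placeholder replacement over the string values in the workflow file. This is an assumption about how pflows resolves the placeholders, not its actual code:

```python
import re

def substitute(value: str, env: dict) -> str:
    """Replace every {{NAME}} placeholder in value with env["NAME"]."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: env[m.group(1)], value)

env = {"BASE_FOLDER": "/home/user/pflows"}
print(substitute("{{BASE_FOLDER}}/datasets/downloaded/cub200_parts-50", env))
# -> /home/user/pflows/datasets/downloaded/cub200_parts-50
```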

We can see the input images and output images below:

Example Bird 1:

(Images: Bird 1 before and after)

Example Bird 2:

(Images: Bird 2 before and after)

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pflows-0.1.20.tar.gz (5.4 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

pflows-0.1.20-py3-none-any.whl (5.4 MB)

Uploaded Python 3

File details

Details for the file pflows-0.1.20.tar.gz.

File metadata

  • Download URL: pflows-0.1.20.tar.gz
  • Upload date:
  • Size: 5.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.4

File hashes

Hashes for pflows-0.1.20.tar.gz
Algorithm Hash digest
SHA256 75f5334b84d12cd15724a502080c0fe31ee322ac1425c36eba7a6c819756b232
MD5 9b5e5e8fb53b0c38c4f28d702737de70
BLAKE2b-256 7960c2c4d5641cc12cf534f2e62a9ff4e41d2ee87f214d1999b4b9d6c5cef5ab

See more details on using hashes here.

File details

Details for the file pflows-0.1.20-py3-none-any.whl.

File metadata

  • Download URL: pflows-0.1.20-py3-none-any.whl
  • Upload date:
  • Size: 5.4 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.4

File hashes

Hashes for pflows-0.1.20-py3-none-any.whl
Algorithm Hash digest
SHA256 002894543f5161d2bd122c333d08c8f313ca1e69dac3055150df017c3b4777bf
MD5 e65aacd2abbb84e149ea18caa8506bd2
BLAKE2b-256 4908e2cb1a8c62ff3d9e9b06c9feb749249c6ab61e81a80b7b7e88ec139cc6fe

