
All you need to prepare and preprocess your annotated images

Project description

PreImutils

License: MIT made-with-python Made in MRL

PreImutils is a Python library built to empower developers, researchers, and students to prepare and preprocess image datasets for applications and systems with deep learning and computer vision capabilities, using just a few lines of code. This documentation provides detailed insight into all the classes and functions available in PreImutils, coupled with a number of code examples.

The Official GitHub Repository of PreImutils is https://github.com/mrl-amrl/preimutils

To read the full documentation, please visit https://a-sharifi.github.io/preimutils-doc/


Why do we need PreImutils?

Everything you need to preprocess your image dataset is here. One of the most important steps for machine learning, CNNs, or any other neural network is preparing your dataset.

- It's easy to use.
- You can use it both from the terminal and in code.
- It has separate classes for object detection and segmentation datasets.

For object detection

Use in Code

from preimutils.object_detection import AMRLImageAug

# json_path: the label JSON file, xmls_dir: the XML annotations dir, images_dir: the images dir
img_aug = AMRLImageAug(json_path, xmls_dir, images_dir)

# Augment until each object has roughly `quantity` samples, resizing the results to 300x300
img_aug.auto_augmentation(quantity, resized=True, width=300, height=300)

Use in Terminal

JSON_PATH=~/YOUR_JSON_PATH/label.json
XMLS_DIR=~/YOUR_ANNOTATION_DIR/
IMAGES_DIR=~/YOUR_IMAGES_DIR/
FUNCTION=auto_augmentation
QUANTITY=1000 # the number of samples to create for each object

preimutils --function $FUNCTION --label_json_path $JSON_PATH --xmls_dir $XMLS_DIR --images_dir $IMAGES_DIR --quantity $QUANTITY
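
Before augmenting, it can be worth confirming that every image actually has a matching XML annotation. The snippet below is a generic standard-library sketch, not part of the preimutils API; it assumes .jpg images and reuses the placeholder directories from above.

from pathlib import Path

# Placeholder paths mirroring the variables above
images_dir = Path('~/YOUR_IMAGES_DIR/').expanduser()
xmls_dir = Path('~/YOUR_ANNOTATION_DIR/').expanduser()

# List images (assumed .jpg here) that have no XML annotation with the same base name
missing = [img.name for img in images_dir.glob('*.jpg')
           if not (xmls_dir / f'{img.stem}.xml').exists()]
print(f'{len(missing)} image(s) without annotations:', missing[:10])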

For segmentation tasks

from preimutils.segmentations.voc import Dataset
from preimutils.segmentations.voc import SegmentationAug


dataset = Dataset('./VOC2012', images_extention='jpg')

# First check that the dataset is valid (all directories exist and every image has a mask)
dataset.check_valid_dataset()
seg = SegmentationAug(dataset.label_map_path, dataset.masks_dir, dataset.images_dir, images_extention='jpg')

# Augment until there are 2000 samples of each object
seg.auto_augmentation(2000)
# Split the dataset into 80% train, 10% validation, 10% test and save it to train.txt, trainval.txt, val.txt, test.txt
dataset.seprate_dataset(shuffle=True, valid_persent=0.10, test_persent=0.10, save=True)
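
As a follow-up, here is a minimal sketch of how the generated split files could be consumed. It assumes each .txt file lists one image identifier per line and that the files land in the standard VOC ImageSets/Segmentation layout; check the full documentation for the exact output location.

from pathlib import Path

def read_split(split_file):
    # One image identifier per line; skip blank lines
    return [line.strip() for line in Path(split_file).read_text().splitlines() if line.strip()]

train_ids = read_split('./VOC2012/ImageSets/Segmentation/train.txt')
val_ids = read_split('./VOC2012/ImageSets/Segmentation/val.txt')
print(len(train_ids), 'train samples,', len(val_ids), 'val samples')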

Some points

1.  The size of your dataset really matters. Too few images cost you accuracy; an excessive number wastes your time and can lead to overfitting. Around 4000 images per object is usually enough, depending on how hard your features are.
2.  The number of samples per object also matters. If the sample counts are not balanced, your neural network tends to forget the objects with fewer samples. For instance, if you have 3 objects, each one should have about 4000 samples; a quick way to check your counts is sketched after the tables below.
3.  Don't forget to shuffle your dataset; if you don't, you will never get good accuracy on all of your objects.
4.  If you want to detect your objects from all angles, don't forget to include samples from those other angles.

Attention

  • No:

        Object      Sample count
        object 1    2000
        object 2    1000
        object 3    4000

  • Yes:

        Object      Sample count
        object 1    3900
        object 2    4100
        object 3    4000
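
To see where your own dataset falls between the two tables above, you can count how many samples each class has. The sketch below is a generic check (not a preimutils function) that assumes Pascal VOC-style XML annotations with <object><name> tags:

import xml.etree.ElementTree as ET
from collections import Counter
from pathlib import Path

def count_objects(xmls_dir):
    # Count annotated samples per class across all VOC-style XML files in a directory
    counts = Counter()
    for xml_file in Path(xmls_dir).expanduser().glob('*.xml'):
        root = ET.parse(xml_file).getroot()
        counts.update(obj.findtext('name') for obj in root.iter('object'))
    return counts

print(count_objects('~/YOUR_ANNOTATION_DIR/'))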

PreImutils helps you handle these points in a few lines of code.

How should I use PreImutils?

We have prepared full documentation at https://a-sharifi.github.io/preimutils-doc/.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

preimutils-1.1.0.tar.gz (22.1 kB)

Uploaded Source

Built Distribution

preimutils-1.1.0-py3-none-any.whl (27.9 kB)

Uploaded Python 3

File details

Details for the file preimutils-1.1.0.tar.gz.

File metadata

  • Download URL: preimutils-1.1.0.tar.gz
  • Upload date:
  • Size: 22.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.8.10

File hashes

Hashes for preimutils-1.1.0.tar.gz

    Algorithm    Hash digest
    SHA256       8cfe11c9b1e2399bd2ceaa28d74a80a1810917956cac9b59eee0a40f7b3b86b7
    MD5          7929e30159dbc5fa8f4b8de4e633c439
    BLAKE2b-256  dd92d3da5a63bab1f9f49c945967e2be851b8e70bd881dab28e7cf718fad7628


File details

Details for the file preimutils-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: preimutils-1.1.0-py3-none-any.whl
  • Upload date:
  • Size: 27.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.8.10

File hashes

Hashes for preimutils-1.1.0-py3-none-any.whl

    Algorithm    Hash digest
    SHA256       93f036373ec65e0ebd3ffb50b5855794a76fca6d7eca3d8e919f4f2bf4f172be
    MD5          a68d4b296d3ad2abda4189450ffdc8f5
    BLAKE2b-256  1c0a122cef377892a344445a18b675b30c39103c63c3d62fe9a83ddaabd4a900

