
Model to identify TV sizes using images

Project description

dlg-home-content

Setup environment

  • create the environment: conda env create -f environment.yml

  • install detectron2 (CPU version):

> conda install pytorch torchvision cpuonly -c pytorch
> python -m pip install detectron2 -f \
  https://dl.fbaipublicfiles.com/detectron2/wheels/cpu/torch1.6/index.html

  • update the environment:

> conda env update --file environment.yml

  • for other GPU versions, follow the matching instructions in the detectron2 installation guide

CLI commands available

  • convert labelme annotations to COCO format:

labelme2coco --labelme_json_location 'data/processed_tv_annotations_v1/' --labels_loc "assets/keypoints.yml" --save_json "data/keypoints/" --train_ratio 0.9 --seed 50

  • train using a custom dataset:

We need to define three config files:

  • base_cfg: the name of a base config file available in detectron2; check detectron2/configs for examples.
  • cfg: a config file containing the modified params; check the configs folder for specific examples.
  • data_cfg: a config with dataset- and keypoint-related params, for example assets/datasets.yml.
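As a rough illustration, a data_cfg along the lines of assets/datasets.yml might look like the sketch below. The key names here are assumptions for orientation only, not the package's actual schema:

```yaml
# hypothetical data_cfg sketch; key names are illustrative, not the real schema
train_json: data/keypoints/train.json     # COCO-format annotations from labelme2coco
val_json: data/keypoints/val.json
image_root: data/processed_tv_annotations_v1/
keypoints_file: assets/keypoints.yml      # keypoint names/order used for training
```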
# normal instance segmentation
custom_train --base_cfg 'COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml' --cfg 'configs/mask_only_exp1.yml' --data_cfg "assets/datasets.yml"

# instance segmentation with keypoints
custom_train --base_cfg 'COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml' --cfg 'configs/keypoint_mask_on_exp1.yml' --data_cfg "assets/datasets.yml"
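The --train_ratio 0.9 --seed 50 flags in the labelme2coco command above imply a deterministic shuffle-and-split of the annotation files. A minimal sketch of that behavior (under the assumption of a simple seeded random split, not the package's actual implementation):

```python
import random

def split_annotations(items, train_ratio=0.9, seed=50):
    """Deterministically split annotation records into train/val lists.

    Shuffles a copy of `items` with a fixed seed, then cuts at train_ratio,
    so repeated runs with the same seed produce the same split.
    """
    items = list(items)
    random.Random(seed).shuffle(items)
    cut = int(len(items) * train_ratio)
    return items[:cut], items[cut:]

train, val = split_annotations([f"img_{i}.json" for i in range(100)])
print(len(train), len(val))  # 90 10
```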

Inference

LOGO Detection

Download the latest inference file from here

from dlg_home_content.tv_detection import InferLogo
config = '../assets/e2e_infer.yml'
model = InferLogo(config)
model.predict(img_loc, visualize=True)
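InferLogo above exposes a single-image predict; a small helper can run it over a whole folder. This is a hypothetical convenience wrapper written against the predict(img_loc, visualize=...) signature shown above, not part of the package:

```python
from pathlib import Path

def predict_folder(model, folder, exts=(".jpg", ".jpeg", ".png")):
    """Run model.predict on every image file in `folder`.

    `model` is anything exposing the predict(img_loc, visualize=...) interface
    shown above. Returns a dict mapping image path -> prediction.
    """
    results = {}
    for img in sorted(Path(folder).iterdir()):
        if img.suffix.lower() in exts:
            results[str(img)] = model.predict(str(img), visualize=False)
    return results
```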

Inference for Keypoint Detection

Download weight files and config files from [here](https://fractalanalytic-my.sharepoint.com/:u:/g/personal/sindhura_k_fractal_ai/EXCaFSHWv3hMo99lvfP4zKIBLBO8dlnWzY7iUAFWYiXHKA?e=23XheZ)

# for inner keypoint detection
from dlg_home_content.inference_pipeline import KeypointInference

config = '../assets/e2e_infer.yml'
# kp_type in ['kp_inner_edge', 'kp_outer_edge', 'kp_sticky_note']
model_inner = KeypointInference(config, kp_type='kp_inner_edge')
predicted_keypoints = model_inner.predict_keypoints(img_loc, visualize=True)

End-to-End Inference pipeline

from dlg_home_content.e2e_inference import E2EInference
config = '../assets/e2e_infer.yml'
final_pipeline = E2EInference(config)
result = final_pipeline.infer(img_loc, 8, 8, True)
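Since the project's goal is estimating TV size from images, a back-of-the-envelope illustration of the final geometric step may help: once opposite screen corners are detected and a pixels-per-inch scale is known (e.g. from a reference object such as the sticky note that kp_sticky_note suggests), the diagonal follows from the Euclidean distance. This is an illustrative calculation, not the package's actual method:

```python
import math

def diagonal_inches(corner_a, corner_b, pixels_per_inch):
    """Screen diagonal in inches from two opposite corner keypoints (x, y)
    and a pixel-to-inch scale factor."""
    dx = corner_b[0] - corner_a[0]
    dy = corner_b[1] - corner_a[1]
    return math.hypot(dx, dy) / pixels_per_inch

# e.g. corners 1200 px apart horizontally and 675 px vertically, at 25 px/inch
print(round(diagonal_inches((0, 0), (1200, 675), 25), 1))  # 55.1
```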

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dlg_home_content-0.0.8.tar.gz (32.8 kB)

Built Distribution

dlg_home_content-0.0.8-py3-none-any.whl (36.5 kB)

