
Project description


Auto DEep LEarning Computer Vision

Python library and dashboard for hyperparameter search and model training for computer vision tasks, based on PyTorch, Optuna, FiftyOne, Dash, and Segmentation Models PyTorch.


The main features of this library are:

  • FiftyOne dataset integration with prediction visualization
  • Upload your dataset in one of the popular formats (currently 2 are supported)
  • Add your own Python class for converting a dataset
  • Display training statistics in TensorBoard
  • Support for all samplers from Optuna
  • Segmentation with SMP: 9 model architectures, popular losses and metrics (see the SMP docs)
  • Convert weights to another format (currently 1 is supported: ONNX); see the sketch after this list
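
Weight conversion of this kind is typically built on top of torch.onnx.export. Below is a minimal sketch of that underlying call (a generic illustration, not AdeleCV's own helper; the placeholder model and file name are assumptions):

import torch
import torchvision

# Any torch.nn.Module can be exported; resnet18 is just a placeholder here
model = torchvision.models.resnet18(weights=None).eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Trace the model with a dummy input and write the graph in ONNX format
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["output"])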

📚 Project Documentation 📚

Visit the Read the Docs project page or read the following README to learn more about the Auto DEep LEarning Computer Vision (AdeleCV for short) library.

📋 Table of contents

  1. Examples
  2. Installation
  3. Dashboard Instructions
  4. Architecture
  5. Citing
  6. License

💡 Examples

  • Example API notebook
  • Video example of using the dashboard

🛠 Installation

Install torch with CUDA if it is not already installed:

$ pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu116
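
To confirm that the CUDA build of torch was picked up (an optional sanity check, unrelated to AdeleCV itself):

import torch

# Should print True when a CUDA-enabled build and a compatible GPU are available
print(torch.cuda.is_available())
print(torch.__version__)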

PyPI version:

$ pip install adelecv

Poetry:

$ poetry add adelecv

📜 Dashboard Instructions

  1. Create a .env file.

See the docs.

NOTIFICATION_LEVEL: DEBUG | INFO | ERROR

Example:

TMP_PATH='./tmp'
DASHBOARD_PORT=8080
FIFTYONE_PORT=5151
TENSORBOARD_PORT=6006
NOTIFICATION_LEVEL=DEBUG
  2. Run (startup takes about 30 seconds; I'm working on speeding it up).
adelecv_dashboard --envfile .env
  3. Help
adelecv_dashboard --help

🏰 Architecture

[architecture diagram]

The user can work with either the API or the dashboard (web app). The API is based on 5 modules:

  • data: the internal representation of the dataset, classes for converting datasets, and the FiftyOne dataset
  • _models: the torch model, its hyperparameters, and functions for training
  • optimize: the set of hyperparameters and the Optuna optimizer (a generic sketch of this pattern follows the list)
  • modification model: export and conversion of weights
  • logs: Python logging
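
As a generic illustration of the Optuna + Segmentation Models PyTorch pattern that the optimize and _models modules build on (this is not AdeleCV's internal code; the dummy batch, architecture choices, and trial budget are assumptions made for the sketch):

import optuna
import segmentation_models_pytorch as smp
import torch

# Dummy batch standing in for a real segmentation dataset
images = torch.randn(4, 3, 128, 128)
masks = torch.randint(0, 2, (4, 1, 128, 128)).float()

def objective(trial):
    # Hyperparameters drawn by the Optuna sampler
    encoder = trial.suggest_categorical("encoder", ["resnet18", "resnet34"])
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)

    model = smp.Unet(encoder_name=encoder, encoder_weights=None, in_channels=3, classes=1)
    loss_fn = smp.losses.DiceLoss(mode="binary")
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    # A single training step; a real run would iterate over the whole dataset
    optimizer.zero_grad()
    loss = loss_fn(model(images), masks)
    loss.backward()
    optimizer.step()
    return loss.item()

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=5)
print(study.best_params)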

The Dash library was used for the dashboard. It is built from components and callbacks wired to those components.
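
A minimal example of that component/callback pattern in Dash (a generic sketch, not the actual AdeleCV dashboard code; the component ids and the callback are made up for illustration):

from dash import Dash, Input, Output, dcc, html

app = Dash(__name__)

# The layout is declared as a tree of components
app.layout = html.Div([
    dcc.Input(id="model-name", value="Unet", type="text"),
    html.Div(id="output"),
])

# Callbacks connect component properties: when the input value changes, the output updates
@app.callback(Output("output", "children"), Input("model-name", "value"))
def show_model(name):
    return f"Selected model: {name}"

if __name__ == "__main__":
    app.run(debug=True)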

📝 Citing

@misc{Mamatin:2023,
  Author = {Denis Mamatin},
  Title = {AdeleCV},
  Year = {2023},
  Publisher = {GitHub},
  Journal = {GitHub repository},
  Howpublished = {\url{https://github.com/AsakoKabe/AdeleCV}}
}

🛡️ License

The project is distributed under the MIT License.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

adelecv-0.0.2.tar.gz (26.9 kB)

Uploaded Source

Built Distribution

adelecv-0.0.2-py3-none-any.whl (41.5 kB)

Uploaded Python 3

File details

Details for the file adelecv-0.0.2.tar.gz.

File metadata

  • Download URL: adelecv-0.0.2.tar.gz
  • Upload date:
  • Size: 26.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.10.9 Linux/5.15.0-1036-azure

File hashes

Hashes for adelecv-0.0.2.tar.gz:

  • SHA256: f4b06de3f960ea2823f81e0df5b97b273efb1a4195437ee5c7e6eb096081d4b3
  • MD5: 6d0971b0dcc947aa14f48a6f1e841003
  • BLAKE2b-256: 342b8eb2c70a1d880b3f98072fe01441bb76687b6388f293caf2478b848fadc6

See more details on using hashes here.

File details

Details for the file adelecv-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: adelecv-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 41.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.10.9 Linux/5.15.0-1036-azure

File hashes

Hashes for adelecv-0.0.2-py3-none-any.whl:

  • SHA256: 0e1956c74b1f4c5c0d15ff25d1149ad1d2f315a74d1543b6ef55f983904ab1e7
  • MD5: 16a72e90b61f49559ed4cb81af0d999b
  • BLAKE2b-256: 469bc84ff743cb331edb23bfaa9d03c9c118b8ada1d9d77a86500507a01defd8

See more details on using hashes here.
