Rock classifier deployed on Railway and monitored using Weights & Biases!
Project description
Whats-this-rock
This project deploys a Telegram bot that classifies rock images into one of seven types.
Installation & Training Steps
Use the Telegram Bot
You can try the bot here on Telegram.
Type /help to get instructions.
Deploy Telegram Bot
pip install -r requirements-prod.txt
python rock_classifier/bot.py
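`rock_classifier/bot.py` is the bot entrypoint. As a rough illustration of the shape such a bot takes, here is a minimal sketch assuming the python-telegram-bot v13 API, with a placeholder `classify_image` standing in for the real model call (none of the names below are taken from the repo):

```python
import os

from telegram import Update
from telegram.ext import CallbackContext, Filters, MessageHandler, Updater


def classify_image(path: str) -> str:
    """Placeholder: the real bot would run the trained classifier here."""
    return "granite (placeholder prediction)"


def handle_photo(update: Update, context: CallbackContext) -> None:
    """Download the incoming photo, classify it, and reply with the label."""
    photo_file = update.message.photo[-1].get_file()  # highest-resolution version
    path = photo_file.download("rock.jpg")
    update.message.reply_text(f"This looks like: {classify_image(path)}")


def main() -> None:
    updater = Updater(token=os.environ["TELEGRAM_BOT_TOKEN"])
    updater.dispatcher.add_handler(MessageHandler(Filters.photo, handle_photo))
    updater.start_polling()
    updater.idle()


if __name__ == "__main__":
    main()
```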
Train Model
Paste your kaggle.json file in the root directory.
Run these commands:
pip install -r requirements-dev.txt
sh src/scripts/setup.sh
python src/models/train.py
You can try different models and parameters by editing config.json.
With Hydra, it's now much easier to override parameters, like this:
python src/models/train.py wandb.project=Whats-this-rockv \
dataset_id=[1,2,3,4] \
epochs=50 \
backbone=resnet
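Overrides like these work because the training entrypoint is decorated with Hydra. A minimal sketch of the pattern, assuming the config keys mirror the overrides above (the exact structure of `src/models/train.py` may differ):

```python
import hydra
import wandb
from omegaconf import DictConfig, OmegaConf


@hydra.main(config_path="../../configs", config_name="configs")
def main(cfg: DictConfig) -> None:
    """Train with whatever values Hydra resolved from the config file plus CLI overrides."""
    print(OmegaConf.to_yaml(cfg))  # e.g. epochs, backbone, dataset_id, wandb.project

    run = wandb.init(project=cfg.wandb.project,
                     config=OmegaConf.to_container(cfg, resolve=True))
    # ... build the model from cfg.backbone, train for cfg.epochs, log metrics to W&B ...
    run.finish()


if __name__ == "__main__":
    main()
```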
Wandb Sweeps (Hyperparameter Tuning)
Edit configs/sweeps.yaml
wandb sweep \
--project Whats-this-rock \
--entity udaylunawat \
configs/sweep.yaml
This will return a command containing `$sweepid`:
wandb agent udaylunawat/Whats-this-rock/$sweepid
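If you prefer to stay in Python, the same flow can be driven with the wandb SDK instead of the CLI. A minimal sketch; the parameter grid below is illustrative and not the contents of `configs/sweeps.yaml`:

```python
import wandb

# Illustrative sweep definition; the real search space lives in configs/sweeps.yaml.
sweep_config = {
    "method": "bayes",
    "metric": {"name": "val_accuracy", "goal": "maximize"},
    "parameters": {
        "epochs": {"values": [25, 50]},
        "backbone": {"values": ["resnet", "efficientnet"]},
    },
}

sweep_id = wandb.sweep(sweep_config, project="Whats-this-rock", entity="udaylunawat")


def train_fn():
    """One sweep trial: read the chosen hyperparameters and train with them."""
    with wandb.init() as run:
        cfg = run.config  # hyperparameters picked by the sweep controller
        # ... call into the existing training code with cfg.epochs, cfg.backbone ...


wandb.agent(sweep_id, function=train_fn, count=10)
```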
Demo
Run in Colab | View Source on GitHub | Download Notebook
Features
| Features added | Features planned |
|---|---|
| Deploy to Huggingface Spaces, access the model through FastAPI (backend), Streamlit (frontend), convert models.py to classes and a more OOP style, nbdev, Group Runs | Find the long-tail or hard examples, find the classes the model is performing terribly on, add badges |
Technologies Used
Directory Tree
├── imgs <- Images for skill banner, project banner and other images
│
├── configs <- Configuration files
│ ├── configs.yaml <- config for single run
│ └── sweeps.yaml <- configuration file for sweeps hyperparameter tuning
│
├── data
│ ├── corrupted_images <- corrupted images will be moved to this directory
│ ├── sample_images <- Sample images for inference
│ ├── 0_raw <- The original, immutable data dump.
│ ├── 1_external <- Data from third party sources.
│ ├── 2_interim <- Intermediate data that has been transformed.
│ └── 3_processed <- The final, canonical data sets for modeling.
│
├── notebooks <- Jupyter notebooks. Naming convention is a number (for ordering),
│ the creator's initials, and a short `-` delimited description, e.g.
│ `1.0-jqp-initial-data-exploration`.
│
│
├── src <- Source code for use in this project.
│ │
│ ├── data <- Scripts to download or generate data
│ │ ├── download.py
│ │ ├── preprocess.py
│ │ └── utils.py
│ │
│ ├── callbacks <- functions that are executed during training at given stages of the training procedure
│ │ ├── custom_callbacks.py
│ │ └── callbacks.py
│ │
│ ├── models <- Scripts to train models and then use trained models to make
│ │ │ predictions
│ │ ├── evaluate.py
│ │ ├── models.py
│ │ ├── predict.py
│ │ ├── train.py
│ │ └── utils.py
│ │
│ ├── scripts <- Scripts to setup dir structure and download datasets
│ │ ├── clean_dir.sh
│ │ ├── dataset1.sh
│ │ ├── dataset2.sh
│ │ ├── dataset3.sh
│ │ ├── dataset4.sh
│ │ └── setup.sh
│ │
│ └── visualization <- Scripts for visualizations
│
├── .dockerignore <- Docker ignore
├── .gitignore <- GitHub's excellent Python .gitignore customized for this project
├── LICENSE <- Your project's license.
├── Makefile <- Makefile with commands like `make data` or `make train`
├── README.md <- The top-level README for developers using this project.
├── requirements.txt <- The requirements file for reproducing the analysis environment, e.g.
│ generated with `pip freeze > requirements.txt`
└── setup.py <- makes project pip installable (pip install -e .) so src can be imported
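For prediction, `src/models/predict.py` uses the trained model on new images; a minimal Keras sketch of that idea, where the model path, input size, preprocessing, and class names are placeholders rather than values from this repo:

```python
import numpy as np
import tensorflow as tf

# Placeholder values: adjust to the model you actually trained.
MODEL_PATH = "model.h5"
IMG_SIZE = (224, 224)
CLASS_NAMES = ["class_0", "class_1", "class_2", "class_3", "class_4", "class_5", "class_6"]

model = tf.keras.models.load_model(MODEL_PATH)

# Load one of the sample images and scale it the same way the model was trained (assumed 0-1).
img = tf.keras.utils.load_img("data/sample_images/sample.jpg", target_size=IMG_SIZE)
batch = np.expand_dims(tf.keras.utils.img_to_array(img) / 255.0, axis=0)

probs = model.predict(batch)[0]
print(f"Predicted class: {CLASS_NAMES[int(np.argmax(probs))]} ({probs.max():.2%})")
```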
Bug / Feature Request
If you find a bug (the site couldn’t handle the query and / or gave undesired results), kindly open an issue here by including your search query and the expected result.
If you’d like to request a new function, feel free to do so by opening an issue here. Please include sample queries and their corresponding results.
Contributing
- Contributions make the open source community such an amazing place to learn, inspire, and create.
- Any contributions you make are greatly appreciated.
- Check out our contribution guidelines for more information.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Credits
Support
This project needs a ⭐️ from you. Don’t forget to leave a star ⭐️
Walt might be the one who knocks
but Hank is the one who rocks.
Project details
File details
Details for the file rocks_classifier-0.0.4.tar.gz.
File metadata
- Download URL: rocks_classifier-0.0.4.tar.gz
- Upload date:
- Size: 29.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 93fa115e5ad09bec3fff27248b6c0ee90af14fc838ce856218edca2d9895b68c |
| MD5 | 466062c5f26e0342872e75cba2a2ba60 |
| BLAKE2b-256 | 0ad592e235022be5684e8c0e852eda3bf972ae02eb16a5234a74fcb8207c11b0 |
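To check that a downloaded file matches the published digest, recompute the SHA256 locally; a small sketch:

```python
import hashlib

EXPECTED_SHA256 = "93fa115e5ad09bec3fff27248b6c0ee90af14fc838ce856218edca2d9895b68c"

with open("rocks_classifier-0.0.4.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else f"Mismatch: {digest}")
```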
File details
Details for the file rocks_classifier-0.0.4-py3-none-any.whl.
File metadata
- Download URL: rocks_classifier-0.0.4-py3-none-any.whl
- Upload date:
- Size: 29.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | eb4402965b9e1a94819aa4f90fea9efbcae433333d518ce5f7f815c982a751fb |
| MD5 | acbf7ccd9d3e162567c4ecccb16028fe |
| BLAKE2b-256 | 31948df4e6b9b24e631f311df489543ef3303270bcc1009be39dc63daaadac17 |