Rock classifier deployed on Railway and monitored using Weights & Biases!
Project description
Whats-this-rock
This project deploys a Telegram bot that classifies rock images into one of seven types.
Installation & Training Steps
Use the Telegram Bot
You can try the bot here on Telegram.
Type /help to get instructions.
Deploy Telegram Bot
pip install -r requirements-prod.txt
python rock_classifier/bot.py
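Under the hood, the bot's reply boils down to loading the trained Keras model and running a single prediction. The sketch below shows only that inference path; the model path, input size, preprocessing, and class names are placeholders for illustration, not the project's actual values.

```python
# Minimal inference sketch (model path, input size, and class names below
# are illustrative placeholders, not the project's actual values).
import numpy as np
import tensorflow as tf

MODEL_PATH = "models/best_model.h5"      # hypothetical saved-model path
IMG_SIZE = (224, 224)                    # assumed input resolution
CLASS_NAMES = [                          # placeholder labels for the 7 classes
    "basalt", "coal", "granite", "limestone",
    "marble", "quartzite", "sandstone",
]

model = tf.keras.models.load_model(MODEL_PATH)

def classify_rock(image_path: str) -> str:
    """Return the predicted rock class for a single image file."""
    img = tf.keras.utils.load_img(image_path, target_size=IMG_SIZE)
    arr = tf.keras.utils.img_to_array(img)[np.newaxis, ...] / 255.0  # assumed scaling
    probs = model.predict(arr, verbose=0)[0]
    return CLASS_NAMES[int(np.argmax(probs))]

print(classify_rock("data/sample_images/example.jpg"))  # hypothetical sample image
```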
Train Model
Paste your kaggle.json file into the root directory, then run these commands:
pip install -r requirements-dev.txt
sh src/scripts/setup.sh
python src/models/train.py
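Before running setup.sh, you can optionally verify that your Kaggle credentials are picked up. This is only a convenience check, assuming the kaggle package is installed; pointing KAGGLE_CONFIG_DIR at the repo root is one possible way to use a kaggle.json kept there.

```python
# Optional sanity check for the kaggle.json step above (assumes the `kaggle`
# package is installed; the env-var approach is just one possible setup).
import os
from pathlib import Path

creds = Path("kaggle.json")  # expected in the project root
if not creds.exists():
    raise FileNotFoundError("kaggle.json not found in the project root")

# Point the Kaggle client at the repo root instead of the default ~/.kaggle.
os.environ["KAGGLE_CONFIG_DIR"] = str(creds.parent.resolve())

from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()  # raises if the token is invalid
print("Kaggle credentials look good.")
```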
You can try different models and parameters by editing config.json.
Using Hydra makes it much easier to override parameters, like this:
python src/models/train.py wandb.project=Whats-this-rockv \
dataset_id=[1,2,3,4] \
epochs=50 \
backbone=resnet
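Those CLI overrides work because train.py is a Hydra entry point. A stripped-down sketch of such an entry point is shown below; the config path, config name, and keys are assumptions for illustration, not copied from the project's configs.

```python
# Sketch of a Hydra-style training entry point (config path/name and keys
# are assumed for illustration, not taken from the project).
import hydra
from omegaconf import DictConfig, OmegaConf


@hydra.main(config_path="../../configs", config_name="configs", version_base=None)
def train(cfg: DictConfig) -> None:
    # Every key here can be overridden from the CLI, e.g. `epochs=50`
    # or `wandb.project=Whats-this-rock`.
    print(OmegaConf.to_yaml(cfg))
    print(f"Training {cfg.backbone} for {cfg.epochs} epochs "
          f"on dataset(s) {cfg.dataset_id}")
    # ... build the model, load data, fit, log to W&B ...


if __name__ == "__main__":
    train()
```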
Wandb Sweeps (Hyperparameter Tuning)
Edit configs/sweeps.yaml
wandb sweep \
--project Whats-this-rock \
--entity udaylunawat \
configs/sweeps.yaml
This will return a command containing a $sweepid; run it to start an agent:
wandb agent udaylunawat/Whats-this-rock/$sweepid
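If you prefer to stay in Python, the same sweep can be created and run programmatically with wandb.sweep and wandb.agent. The sketch below is illustrative only: the search space is invented, and train_once is a stand-in for the project's real training function.

```python
# Programmatic alternative to the CLI sweep above (parameter values are
# invented for illustration, not the project's sweeps.yaml).
import wandb

sweep_config = {
    "method": "bayes",
    "metric": {"name": "val_accuracy", "goal": "maximize"},
    "parameters": {
        "epochs": {"values": [20, 50]},
        "backbone": {"values": ["resnet", "efficientnet"]},
        "learning_rate": {"min": 1e-4, "max": 1e-2},
    },
}

def train_once():
    """Placeholder train function; the real project would call its trainer here."""
    with wandb.init() as run:
        cfg = run.config
        # ... train with cfg.epochs, cfg.backbone, cfg.learning_rate ...
        run.log({"val_accuracy": 0.0})  # dummy metric for the sketch

sweep_id = wandb.sweep(sweep_config, project="Whats-this-rock", entity="udaylunawat")
wandb.agent(sweep_id, function=train_once, count=10)
```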
Demo
Run in Colab | View Source on GitHub | Download Notebook
Features
Features added:
- Deploy to Huggingface spaces
- Accessing the model through FastAPI (backend)
- Streamlit (frontend)
- Convert models.py to classes and a more OOP style
- nbdev
- Group runs

Features planned:
- Find the long-tail / hard examples
- Find the classes the model is performing terribly on
- Add badges
Technologies Used
Directory Tree
├── imgs <- Images for skill banner, project banner and other images
│
├── configs <- Configuration files
│ ├── configs.yaml <- config for single run
│ └── sweeps.yaml <- configuration file for sweeps hyperparameter tuning
│
├── data
│ ├── corrupted_images <- corrupted images will be moved to this directory
│ ├── sample_images <- Sample images for inference
│ ├── 0_raw <- The original, immutable data dump.
│ ├── 1_external <- Data from third party sources.
│ ├── 2_interim <- Intermediate data that has been transformed.
│ └── 3_processed <- The final, canonical data sets for modeling.
│
├── notebooks <- Jupyter notebooks. Naming convention is a number (for ordering),
│ the creator's initials, and a short `-` delimited description, e.g.
│ `1.0-jqp-initial-data-exploration`.
│
│
├── src <- Source code for use in this project.
│ │
│ ├── data <- Scripts to download or generate data
│ │ ├── download.py
│ │ ├── preprocess.py
│ │ └── utils.py
│ │
│ ├── callbacks <- functions that are executed during training at given stages of the training procedure
│ │ ├── custom_callbacks.py
│ │ └── callbacks.py
│ │
│ ├── models <- Scripts to train models and then use trained models to make
│ │ │ predictions
│ │ ├── evaluate.py
│ │ ├── models.py
│ │ ├── predict.py
│ │ ├── train.py
│ │ └── utils.py
│ │
│ └── scripts <- Scripts to setup dir structure and download datasets
│ │ ├── clean_dir.sh
│ │ ├── dataset1.sh
│ │ ├── dataset2.sh
│ │ ├── dataset3.sh
│ │ ├── dataset4.sh
│ │ └── setup.sh
│ │
│ └── visualization <- Scripts for visualizations
│
├── .dockerignore <- Docker ignore
├── .gitignore <- GitHub's excellent Python .gitignore customized for this project
├── LICENSE <- Your project's license.
├── Makefile <- Makefile with commands like `make data` or `make train`
├── README.md <- The top-level README for developers using this project.
├── requirements.txt <- The requirements file for reproducing the analysis environment, e.g.
│ generated with `pip freeze > requirements.txt`
└── setup.py <- makes project pip installable (pip install -e .) so src can be imported
Bug / Feature Request
If you find a bug (the bot couldn't handle an image and/or gave undesired results), kindly open an issue here, including the input you used and the expected result.
If you'd like to request a new feature, feel free to do so by opening an issue here. Please include sample inputs and their expected results.
Contributing
- Contributions make the open source community such an amazing place to learn, inspire, and create.
- Any contributions you make are greatly appreciated.
- Check out our contribution guidelines for more information.
License
Whats-this-rock is licensed under the MIT License - see the LICENSE file for details.
Credits
Support
This project needs a ⭐️ from you. Don’t forget to leave a star ⭐️
Walt might be the one who knocks
but Hank is the one who rocks.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file rocks_classifier-0.0.6.tar.gz.
File metadata
- Download URL: rocks_classifier-0.0.6.tar.gz
- Upload date:
- Size: 30.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8d891ad78836e81d53bd9cd3c9d509eaf29d812f3a82c3971da147b2a82bcf6f
MD5 | 30755454a37f3c07f9121ac73595afc5
BLAKE2b-256 | 84adade1baab9be21a353145dd851dafc3db4db379e3aa1ebc63646259fa75fd
File details
Details for the file rocks_classifier-0.0.6-py3-none-any.whl.
File metadata
- Download URL: rocks_classifier-0.0.6-py3-none-any.whl
- Upload date:
- Size: 31.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | d156c774253468c130d0d1a77872320b5c449f53c6a99cf2be74a6fcccaa5259
MD5 | fe4929aa9b5a4fb172a90ab3a1381945
BLAKE2b-256 | 1b55b7b784dc75c8796f32fe02124955c732a5cc2e778491e91b14d92192e6b1