A lightweight binary image classification system built with scikit‑learn, focusing on k‑Nearest Neighbors (kNN) and classical ML models. It features dynamic, model‑aware image preprocessing (HOG, scaling, PCA) that adapts automatically to image characteristics, enabling efficient training and reliable inference for custom datasets.
DynamicML 🚀
Dynamic, model‑aware classical machine learning for image classification
Created by Sunkara Sai Ganesh, Karrothu Mourya, Kudirella Sanmuka Sai, Dontala Kiran Kumar.
DynamicML is a lightweight Python library built on top of scikit‑learn, TensorFlow, and PyTorch that enables classification using both classical machine learning models and deep learning models, with a strong focus on dynamic, model‑aware preprocessing. The library is designed to automatically adapt preprocessing pipelines based on data characteristics and model requirements, making it easy to experiment, benchmark, and deploy classical ML solutions for image data, typically in just 2-4 lines of code.
DynamicML (Classical ML) – Project Overview
This project implements classical machine learning models supported by DynamicML, built on top of scikit-learn.
Classical ML classifier (Single model at a time)
The goal is to classify images into two or more classes efficiently using lightweight ML models.
Supported Models (DynamicML)
DynamicML supports most classical classifiers from scikit-learn, including:
🔹 Linear Models
SGDClassifier
RidgeClassifier
🔹 Distance-Based Models
k-Nearest Neighbors (kNN)
Radius Neighbors
Nearest Centroid
🔹 Support Vector Machines
Linear SVM
Kernel SVM (RBF, Polynomial, Sigmoid)
🔹 Probabilistic Models
Gaussian Naive Bayes
Multinomial Naive Bayes
Complement Naive Bayes
Bernoulli Naive Bayes
🔹 Tree-Based Models
Decision Tree
Random Forest
Extra Trees
🔹 Boosting & Ensembles
AdaBoost
Gradient Boosting
HistGradientBoosting
Bagging
Voting
Stacking
🔹 Neural & Advanced Models
MLPClassifier
GaussianProcessClassifier
⚠️ Note: In this project, only one model is trained at a time.
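Because every model in the list above is a standard scikit-learn estimator, any one of them can be dropped into the same fit/score workflow. A minimal sketch on synthetic data (this illustrates the single-model-at-a-time idea, not DynamicML's internal API):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import RidgeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# synthetic stand-in for flattened image features
X, y = make_classification(n_samples=200, random_state=42)

# any single classifier from the list above can be swapped in here
for clf in (KNeighborsClassifier(), RidgeClassifier(), GaussianNB()):
    clf.fit(X, y)
    assert clf.score(X, y) > 0.5  # sanity check: better than chance
```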
Dataset Structure

```
dataset/
├── class_0/
│   ├── img1.jpg
│   ├── img2.jpg
│   └── ...
├── class_1/
│   ├── img1.jpg
│   ├── img2.jpg
│   └── ...
└── class_2/
    ├── img1.jpg
    ├── img2.jpg
    └── ...
```
Each folder represents one class.
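A minimal sketch of how such a folder layout can be turned into feature matrices. `load_dataset` and `read_image` are hypothetical helpers for illustration, not DynamicML's actual API:

```python
from pathlib import Path

import numpy as np

def load_dataset(root, read_image):
    """Walk the dataset folder: each subfolder name becomes a class label.

    `read_image` should return a fixed-size array for every file;
    images are flattened into 1-D feature vectors for classical models.
    """
    X, y = [], []
    for class_dir in sorted(p for p in Path(root).iterdir() if p.is_dir()):
        for img_path in sorted(class_dir.iterdir()):
            X.append(np.asarray(read_image(img_path)).ravel())
            y.append(class_dir.name)
    return np.array(X), np.array(y)
```

In a real pipeline `read_image` would wrap `cv2.imread` plus a resize; here it is left as a parameter so any image backend can be plugged in.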
Preprocessing Pipeline
The project uses a classical ML pipeline for data processing:
📊 Train-Test Split

```python
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
```
80% Training
20% Testing
Stratified to maintain class balance
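The effect of the stratified 80/20 split can be seen on a tiny balanced toy set (synthetic data for illustration only):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# toy data: 10 samples per class, so stratification is easy to verify
X = np.arange(40).reshape(20, 2)
y = np.array([0] * 10 + [1] * 10)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
# 20 samples -> 16 train / 4 test, with both classes
# appearing in the test set in the same 50/50 proportion
```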
⚙️ Model Training
Example (Random Forest):
```python
RandomForestClassifier(n_estimators=100, max_depth=None, random_state=42)
```
Steps:
Load dataset
Train model
Evaluate performance
Save model using joblib
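The four steps above can be sketched end to end. The scikit-learn digits dataset stands in for an image folder here; the real project would load images from `dataset/` instead:

```python
import os
import tempfile

import joblib
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# 1. Load dataset (digits is a stand-in for the image folders)
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# 2. Train a single model
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# 3. Evaluate on the held-out split
acc = model.score(X_test, y_test)

# 4. Save with joblib (temp path used here for illustration)
path = os.path.join(tempfile.mkdtemp(), "random_forest.joblib")
joblib.dump(model, path)
```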
📈 Evaluation Metrics
The model is evaluated using:
✅ Accuracy
✅ Classification Report
✅ Confusion Matrix
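All three metrics come straight from `sklearn.metrics`; a small hand-checkable example (toy labels, not project output):

```python
from sklearn.metrics import (accuracy_score, classification_report,
                             confusion_matrix)

y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0]

acc = accuracy_score(y_true, y_pred)      # 4 of 6 correct
cm = confusion_matrix(y_true, y_pred)     # rows: true class, cols: predicted
report = classification_report(y_true, y_pred)  # per-class P/R/F1 text table
```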
Example output:
Accuracy: 0.91
Precision, Recall, and F1-score per class

💾 Model Saving
Trained models are saved as:
modelname_timestamp.joblib
Saved bundle includes:
```python
{"model": trained_model, "label_encoder": label_encoder}
```
This ensures reproducibility and easy deployment.
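A sketch of the bundle round-trip: saving the model together with its label encoder means predictions can be decoded back to class names at load time. The kNN model and `knn_...` filename are illustrative, not prescribed by the project:

```python
import os
import tempfile
from datetime import datetime

import joblib
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import LabelEncoder

X = [[0, 0], [0, 1], [1, 0], [1, 1]]
labels = ["cat", "cat", "dog", "dog"]

le = LabelEncoder()
y = le.fit_transform(labels)
model = KNeighborsClassifier(n_neighbors=1).fit(X, y)

# modelname_timestamp.joblib, bundling model + encoder together
name = f"knn_{datetime.now():%Y%m%d_%H%M%S}.joblib"
path = os.path.join(tempfile.mkdtemp(), name)
joblib.dump({"model": model, "label_encoder": le}, path)

# later, at inference time: load and decode predictions
bundle = joblib.load(path)
pred = bundle["label_encoder"].inverse_transform(
    bundle["model"].predict([[1, 1]])
)
```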
🚀 How to Run

Step 1: Install Requirements

```
pip install scikit-learn numpy opencv-python scikit-image joblib
```

Step 2: Run Training

```python
Scikit_Img_Classif_Supervised.RandomForest_Classifier("dataset_path")
```

🏗 System Architecture

```
Dataset
  ↓
Preprocessing
  ↓
Classifier (Single Model)
  ↓
Evaluation
  ↓
Model Saving
```
🎯 Why Classical ML Instead of CNN?
Lightweight
Faster training
Works well on small datasets
Suitable for hackathons
Easier interpretability
File details
Details for the file dynamicml-1.1.5.tar.gz.
File metadata
- Download URL: dynamicml-1.1.5.tar.gz
- Upload date:
- Size: 13.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `91e29e5d8e71e93f50c5754fc4bf41aa9ba43e1910c861d656b89b07713dff2b` |
| MD5 | `96ebbc38a39c38ed282d3e96f6f55e0a` |
| BLAKE2b-256 | `d0edce3e1a7f0779b127634e21cd2da66307fd414180a6e95bba60ebade2afa8` |
File details
Details for the file dynamicml-1.1.5-py3-none-any.whl.
File metadata
- Download URL: dynamicml-1.1.5-py3-none-any.whl
- Upload date:
- Size: 13.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `d379f4bddbbcfef3e16ca01ebe2cbc1bca29cd7a9be8b4bf41de4111512b83ec` |
| MD5 | `169c1650182a0e2855f2c45d649de7b1` |
| BLAKE2b-256 | `0f155f10a8128a4d4dfbcc80275eeb9a0829b7d8994bc16cd09e69bcd5a2fb6d` |