LazyTune
Fast hyperparameter optimization using screening and pruning.
LazyTune is a fast and efficient hyperparameter optimization framework for machine learning models.
It dramatically reduces unnecessary training time by using a smart screening → ranking → pruning → full training pipeline — while staying 100% compatible with scikit-learn estimators.
Supports classification & regression, all scikit-learn metrics, custom scorers, cross-validation screening, early pruning of poor configurations, parallel execution, and clean result reporting.
Features
- Compatible with any scikit-learn-style estimator
- Works for both classification and regression
- Supports all scikit-learn built-in metrics (`accuracy`, `f1`, `r2`, `neg_mean_squared_error`, etc.)
- Allows custom scoring functions via `make_scorer`
- Fast initial screening with cross-validation
- Early pruning of weak hyperparameter settings (`prune_ratio`)
- Parallel execution support (`n_jobs`)
- Structured trial summaries and ranking
- Returns the best model, parameters, and score, plus a detailed report
Installation
```shell
pip install lazytune
```
Quick Start Example
```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from lazytune import SmartSearch

X, y = load_breast_cancer(return_X_y=True)

param_grid = {
    "n_estimators": [50, 100, 150, 200],
    "max_depth": [5, 10, 15, None],
    "min_samples_split": [2, 3, 4, 5],
}

search = SmartSearch(
    estimator=RandomForestClassifier(random_state=42),
    param_grid=param_grid,
    metric="accuracy",
    cv_folds=3,
    prune_ratio=0.5,  # keep top 50% after screening
    n_jobs=-1,        # use all available cores
)
search.fit(X, y)

print("Best parameters:", search.best_params_)
print("Best CV score:  ", search.best_score_)
print("\nBest model:\n", search.best_estimator_)
```
More Examples
SVM Classification
```python
from sklearn.svm import SVC
from lazytune import SmartSearch

search = SmartSearch(
    estimator=SVC(random_state=42),
    param_grid={
        "C": [0.1, 1, 10, 50, 100],
        "kernel": ["linear", "rbf"],
        "gamma": ["scale", "auto", 0.001, 0.0001],
    },
    metric="f1_macro",
    cv_folds=5,
    prune_ratio=0.6,
)
```
Regression (Random Forest)
```python
from sklearn.ensemble import RandomForestRegressor
from lazytune import SmartSearch

search = SmartSearch(
    estimator=RandomForestRegressor(random_state=42),
    param_grid={
        "n_estimators": [100, 200, 300, 500],
        "max_depth": [8, 12, 16, None],
        "min_samples_split": [2, 4, 8],
    },
    metric="r2",
    cv_folds=4,
    n_jobs=-1,
)
```
Supported Metrics (examples)
Classification
accuracy • f1 • f1_macro • f1_weighted • precision • recall • roc_auc • balanced_accuracy • ...
Regression
r2 • neg_mean_squared_error • neg_root_mean_squared_error • neg_mean_absolute_error • neg_mean_absolute_percentage_error • ...
Custom metrics → use `sklearn.metrics.make_scorer`
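For example, a custom F2 scorer (recall-weighted F-score) can be built with `make_scorer` and then passed anywhere a scoring string is accepted. The model and dataset below are illustrative choices, and the snippet uses plain scikit-learn cross-validation to stay self-contained:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import fbeta_score, make_scorer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# make_scorer wraps any metric(y_true, y_pred) function into a scorer object
# that scikit-learn's CV utilities accept; per the docs above, the same object
# can be passed to SmartSearch's `metric` argument.
f2_scorer = make_scorer(fbeta_score, beta=2)

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, scoring=f2_scorer, cv=3)
print("Mean F2 across folds:", scores.mean())
```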
How It Works (LazyTune Strategy)
- Generate all (or sampled) hyperparameter combinations
- Quick screening round with cross-validation (low resources)
- Rank configurations by performance
- Prune bottom performers (controlled by `prune_ratio`)
- Train remaining promising candidates more thoroughly
- Return the best model plus a full summary of all evaluated trials
→ Much faster than a full GridSearchCV, usually with very similar final performance.
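The steps above can be sketched with plain scikit-learn parts. This is an illustration of the screen → rank → prune → full-train strategy under simplifying assumptions (tiny grid, cheap 2-fold screening), not LazyTune's actual implementation:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import ParameterGrid, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
grid = ParameterGrid({"max_depth": [1, 2, 3, None],
                      "min_samples_split": [2, 10]})
prune_ratio = 0.5  # keep the top 50% after screening

# 1) Cheap screening round: few folds, rank every candidate.
screened = []
for params in grid:
    est = DecisionTreeClassifier(random_state=0, **params)
    screened.append((cross_val_score(est, X, y, cv=2).mean(), params))
screened.sort(key=lambda t: t[0], reverse=True)

# 2) Prune the bottom performers.
keep = max(1, int(len(screened) * prune_ratio))
survivors = [params for _, params in screened[:keep]]

# 3) Thorough evaluation (more folds) of the survivors only.
best_score, best_params = max(
    ((cross_val_score(DecisionTreeClassifier(random_state=0, **p),
                      X, y, cv=5).mean(), p) for p in survivors),
    key=lambda t: t[0],
)
print("Best:", best_params, "score:", round(best_score, 3))
```

Only the surviving half of the grid pays for the expensive 5-fold evaluation, which is where the speedup over exhaustive search comes from.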
Main API – SmartSearch
Key Attributes
| Attribute | Description |
|---|---|
| `best_params_` | Best found hyperparameter dictionary |
| `best_score_` | Best cross-validated score |
| `best_estimator_` | Fully fitted estimator with the best parameters |
| `summary_` | pandas DataFrame with trial results and rankings |
| `cv_results_` | Detailed cross-validation results per candidate |
Main Methods
- `.fit(X, y)`
- `.predict(X)`
- `.score(X, y)`
- `.get_params()` / `.set_params()`
Requirements
- Python ≥ 3.8
- numpy
- pandas
- scikit-learn
Author
Anik Chand
License
MIT License
Feedback, issues, stars, and contributions are very welcome!
Happy tuning! 🚀
File details
Details for the file lazytune-0.1.2.tar.gz.
File metadata
- Download URL: lazytune-0.1.2.tar.gz
- Upload date:
- Size: 10.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `f711d7ffecf9b64d590aede8de39fdf3537c45e789d9ddafd97a62d139097455` |
| MD5 | `0a396e453f0e168f430dab5415243369` |
| BLAKE2b-256 | `e25f7cbf98aa949bca2ba1822567e69f17c6f475e38dffdc9d63e43ac048e290` |
File details
Details for the file lazytune-0.1.2-py3-none-any.whl.
File metadata
- Download URL: lazytune-0.1.2-py3-none-any.whl
- Upload date:
- Size: 9.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `6513c963fceea9c309908fbb062ea45984a05d5b588510c08f9da736e9779ec1` |
| MD5 | `3945ac8ff16d1dd4332a385751b5ca4a` |
| BLAKE2b-256 | `96ef06b5954dfe1eb2c22a9ecbe87d69fb4790ce456a6a10560b3cd41fa737a1` |