
Fast hyperparameter optimization using screening and pruning

Project description

LazyTune 🚀

Fast, smart, and lazy hyperparameter optimization for scikit-learn models — up to 5–10× faster than GridSearchCV with almost the same final performance.

(Data flow diagram)

Live Demo: https://lazytune.vercel.app
PyPI: https://pypi.org/project/lazytune/


🔥 Why LazyTune?

  • 100% compatible with any scikit-learn-style estimator
  • Works for classification & regression
  • Supports all scikit-learn metrics + custom scorers
  • Smart pipeline: screening → ranking → pruning → full training
  • Early pruning of poor configurations (prune_ratio)
  • Parallel execution (n_jobs)
  • Clean trial summaries & rankings in pandas DataFrame
  • Returns best model, params, score + detailed report

Installation

pip install lazytune

Quick Start

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from lazytune import SmartSearch

X, y = load_breast_cancer(return_X_y=True)

param_grid = {
    "n_estimators": [50, 100, 150, 200],
    "max_depth": [5, 10, 15, None],
    "min_samples_split": [2, 3, 4, 5]
}

search = SmartSearch(
    estimator=RandomForestClassifier(random_state=42),
    param_grid=param_grid,
    metric="accuracy",
    cv_folds=3,
    prune_ratio=0.5,       # keep top 50% after screening
    n_jobs=-1              # use all cores
)

search.fit(X, y)

print("Best parameters:", search.best_params_)
print("Best CV score:   ", search.best_score_)
print("\nBest model:\n", search.best_estimator_)

More Examples

SVM Classification

from sklearn.datasets import load_breast_cancer
from sklearn.svm import SVC
from lazytune import SmartSearch

X, y = load_breast_cancer(return_X_y=True)

search = SmartSearch(
    estimator=SVC(random_state=42),
    param_grid={
        "C": [0.1, 1, 10, 50, 100],
        "kernel": ["linear", "rbf"],
        "gamma": ["scale", "auto", 0.001, 0.0001]
    },
    metric="f1_macro",
    cv_folds=5,
    prune_ratio=0.6
)
search.fit(X, y)

Regression (Random Forest)

from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from lazytune import SmartSearch

X, y = load_diabetes(return_X_y=True)

search = SmartSearch(
    estimator=RandomForestRegressor(random_state=42),
    param_grid={
        "n_estimators": [100, 200, 300, 500],
        "max_depth": [8, 12, 16, None],
        "min_samples_split": [2, 4, 8]
    },
    metric="r2",
    cv_folds=4,
    n_jobs=-1
)
search.fit(X, y)

Supported Metrics (examples)

Classification

accuracy • f1 • f1_macro • f1_weighted • precision • recall • roc_auc • balanced_accuracy • ...

Regression

r2 • neg_mean_squared_error • neg_root_mean_squared_error • neg_mean_absolute_error • ...

Custom metrics → use sklearn.metrics.make_scorer
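For example, a custom scorer can be built with scikit-learn's make_scorer (shown below with cross_val_score; whether SmartSearch's metric parameter also accepts a scorer object directly is an assumption not confirmed by the docs above):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import fbeta_score, make_scorer
from sklearn.model_selection import cross_val_score

# Custom scorer: F2 weights recall twice as heavily as precision
f2_scorer = make_scorer(fbeta_score, beta=2)

X, y = load_breast_cancer(return_X_y=True)
clf = RandomForestClassifier(n_estimators=50, random_state=42)

# The scorer plugs in anywhere scikit-learn accepts a scoring callable
scores = cross_val_score(clf, X, y, cv=3, scoring=f2_scorer)
print(round(scores.mean(), 3))
```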


How LazyTune Works

  1. Generate all (or sampled) hyperparameter combinations
  2. Quick screening round with cross-validation
  3. Rank configurations by performance
  4. Prune bottom performers (prune_ratio)
  5. Train remaining promising candidates thoroughly
  6. Return best model + full trial summary

Much faster than GridSearchCV / RandomizedSearchCV, usually with very close (or identical) final performance.
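The screen → rank → prune → full-training pipeline can be sketched with plain scikit-learn. This is an illustrative sketch of the idea, not LazyTune's actual implementation; the 2-fold screening round and the small grid are assumptions made for brevity:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import ParameterGrid, cross_val_score

X, y = load_breast_cancer(return_X_y=True)
grid = ParameterGrid({"n_estimators": [50, 100], "max_depth": [5, None]})
prune_ratio = 0.5  # keep the top 50% after screening

# Steps 1-2: quick screening round with cheap 2-fold CV
screened = [
    (cross_val_score(RandomForestClassifier(random_state=42, **p),
                     X, y, cv=2).mean(), p)
    for p in grid
]

# Steps 3-4: rank configurations and prune the bottom performers
screened.sort(key=lambda t: t[0], reverse=True)
survivors = screened[: max(1, int(len(screened) * prune_ratio))]

# Steps 5-6: train the survivors thoroughly (5-fold CV), keep the best
best_score, best_params = max(
    ((cross_val_score(RandomForestClassifier(random_state=42, **p),
                      X, y, cv=5).mean(), p) for _, p in survivors),
    key=lambda t: t[0],
)
print(best_params, round(best_score, 3))
```

Only the surviving half of the grid ever gets the expensive 5-fold evaluation, which is where the speedup over exhaustive GridSearchCV comes from.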


Main API – SmartSearch

Key Attributes

| Attribute | Description |
| --- | --- |
| best_params_ | Best hyperparameter dictionary |
| best_score_ | Best cross-validated score |
| best_estimator_ | Fully fitted estimator with best parameters |
| summary_ | pandas DataFrame with trial results & rankings |
| cv_results_ | Detailed CV results per candidate |

Main Methods

  • .fit(X, y)
  • .predict(X)
  • .score(X, y)
  • .get_params() / .set_params()

Requirements

  • Python ≥ 3.8
  • numpy
  • pandas
  • scikit-learn

Made with ❤️ by Anik Chand
MIT License

Feedback, issues, stars, and contributions are very welcome! 🌟


Happy tuning! 🚀

Download files

Download the file for your platform.

Source Distribution

lazytune-0.1.3.tar.gz (10.6 kB)

Uploaded Source

Built Distribution


lazytune-0.1.3-py3-none-any.whl (9.9 kB)

Uploaded Python 3

File details

Details for the file lazytune-0.1.3.tar.gz.

File metadata

  • Download URL: lazytune-0.1.3.tar.gz
  • Upload date:
  • Size: 10.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.6

File hashes

Hashes for lazytune-0.1.3.tar.gz
Algorithm Hash digest
SHA256 32e51198eec96404f1f19a6ef5e72c0c0c34e841bb8013e8371a3ce1f8ad1f8a
MD5 24f0abbbd3d385ab2d2d5182f23300d2
BLAKE2b-256 c7adca9bbd59a176a825cf189228f653520a718d5504d8a34e773462e466f123


File details

Details for the file lazytune-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: lazytune-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 9.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.6

File hashes

Hashes for lazytune-0.1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 ebfc92bb2ec16a80d95b9b40ab68dd74797e77dce2ba27c23e58a0bc61ab897b
MD5 31a403a4fb99f4f8f8eca04725ba9f3a
BLAKE2b-256 bf76a6ebab41782ebedd358fbec6639d67bf63fbf40b1d26510ad713d3a65381

