letstune

Hyper-parameter tuning for the masses!
Machine learning algorithms have many parameters that must be chosen by the user, such as the number of layers or the learning rate. Finding good values requires a lot of trial and error.
letstune automatically tries various parameter configurations and gives you back the best model.
How does it differ from GridSearchCV?
letstune gives you a better model in less time than classical hyperparameter tuning algorithms. It works as follows:
- Generate random parameters
- Evaluate each parameter with a small time budget
- Drop low performers automatically; only good performers stay in the pool

The third point is the distinguishing feature of letstune: other algorithms dutifully train weak models to completion, without a good reason.
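For intuition, here is a minimal, self-contained sketch of such a successive-halving loop in plain Python. It is a conceptual illustration only, not letstune's implementation; the evaluate stand-in and the budget numbers are invented for the example.

import random

def successive_halving(n_params=16, rounds=3, keep=0.5):
    # Stand-in for real training: score a configuration after
    # training it with the given time budget (higher is better).
    def evaluate(params, budget):
        return random.random()

    # 1. Generate random parameters.
    pool = [{"lr": random.uniform(1e-4, 1e-1)} for _ in range(n_params)]
    budget = 1
    for _ in range(rounds):
        # 2. Evaluate each configuration with a small time budget.
        scored = sorted(pool, key=lambda p: evaluate(p, budget), reverse=True)
        # 3. Drop low performers; survivors get a doubled budget.
        pool = scored[: max(1, int(len(scored) * keep))]
        budget *= 2
    return pool[0]

best = successive_halving()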
Ergonomics
Common tasks in letstune are realized with Python one-liners:
The best model:
model = tuning[0].best_epoch.checkpoint.load_pickle()
A pandas DataFrame summarizing parameters and metric values:
df = tuning.to_df()
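From there, ordinary pandas operations apply. For example, assuming the metric column is named accuracy, as in the trainer shown later (the column name is an assumption for this example):

# Top configurations by metric value; "accuracy" is an assumed column name.
print(df.sort_values("accuracy", ascending=False).head())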
Great! How to use it?
Install with pip:
pip install letstune
First, define your parameters:
import letstune
from letstune import Params, rand

from sklearn.linear_model import SGDClassifier


class SGDClassifierParams(Params):
    model_cls = SGDClassifier

    average: bool
    l1_ratio: float = rand.uniform(0, 1)
    alpha: float = rand.uniform(1e-2, 1e0, log=True)
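A note on log=True: it presumably makes the distribution log-uniform, i.e. the exponent is sampled uniformly, so values spread evenly across orders of magnitude. A plain-numpy equivalent of the alpha range above (not letstune API):

import numpy as np

# Log-uniform sample between 1e-2 and 1e0:
# draw the base-10 exponent uniformly, then exponentiate.
alpha = 10 ** np.random.uniform(np.log10(1e-2), np.log10(1e0))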
Then define a trainer. A trainer is an object that knows how to train a model!
class DigitsTrainer(letstune.SimpleTrainer):
    params_cls = SGDClassifierParams
    metric = "accuracy"

    def load_dataset(self, dataset):
        self.X_train, self.X_test, self.y_train, self.y_test = dataset

    def train(self, params):
        # params has type SGDClassifierParams

        # letstune provides method create_model
        # returning SGDClassifier
        model = params.create_model(
            loss="hinge",
            penalty="elasticnet",
            fit_intercept=True,
            random_state=42,
        )
        model.fit(self.X_train, self.y_train)
        accuracy = model.score(self.X_test, self.y_test)
        return model, {"accuracy": accuracy}
trainer = DigitsTrainer() # new instance!
Neural network and gradient boosting trainings can be based on letstune.EpochTrainer, which has a train_epoch method.
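For orientation, here is a rough sketch of what an epoch-based trainer could look like. Apart from train_epoch, which the sentence above names, every method and attribute in this sketch (the create_model setup hook, the Keras-style model, the metric name) is an assumption for illustration only; consult the documentation for letstune's actual EpochTrainer interface.

class MNISTTrainer(letstune.EpochTrainer):  # hypothetical trainer
    params_cls = MyNetParams  # assumed Params subclass, not defined here
    metric = "val_accuracy"

    def load_dataset(self, dataset):
        self.X_train, self.X_val, self.y_train, self.y_val = dataset

    def create_model(self, params):  # assumed setup hook
        self.model = params.create_model()

    def train_epoch(self, epoch):
        # Train for one more epoch; letstune keeps only promising
        # trainings alive between epochs.
        self.model.fit(self.X_train, self.y_train, epochs=1, verbose=0)
        _, val_accuracy = self.model.evaluate(
            self.X_val, self.y_val, verbose=0
        )
        return {"val_accuracy": val_accuracy}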
Finally, let's tune!
tuning = letstune.tune(
    trainer,
    16,  # number of tested random parameters
    dataset=(X_train, X_test, y_train, y_test),
    results_dir="digits_tuning",
)
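The X_train, X_test, y_train, y_test passed to dataset above are assumed to be prepared beforehand. Given the DigitsTrainer name, a standard split of scikit-learn's digits dataset would fit; the split below is a guess at the setup, not code taken from letstune's examples:

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Load the digits dataset and split it into train and test parts.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)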
Our model is ready to use:
model = tuning[0].checkpoint.load_pickle()
Don't forget to check out the examples directory! 👀
Documentation is here!
Additionally

- Works with your favourite ML library 🐍 - it's library agnostic!
- Resumes work from the point where the program was stopped.
- Permissive, business-friendly MIT license.
References
A System for Massively Parallel Hyperparameter Tuning by Li et al.; arXiv:1810.05934
Overview of various hyperparameter-tuning algorithms. letstune implements a variant of Successive Halving.
Contributing
Issues are tracked on GitHub.
Changelog
Please see CHANGELOG.md.