autolgbm: tuning lightgbm with optuna
AutoLGBM
LightGBM + Optuna: no brainer
- auto train lightgbm directly from CSV files
- auto tune lightgbm using optuna
- auto serve best lightgbm model using fastapi
NOTE: PRs are currently ~~not accepted. If there are issues/problems, please create an issue~~ accepted. If there are issues/problems, please solve with a PR.
Inspired by Abhishek Thakur's AutoXGB.
Installation
Install using pip
pip install autolgbm
Usage
Training a model using AutoLGBM is a piece of cake. All you need is some tabular data.
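For example, a minimal training CSV can be generated with pandas. The snippet below is only an illustration; the column names (age, education, hours_per_week, income) are made up for this example and are not required by AutoLGBM.
import os
import pandas as pd

# Create a tiny, purely illustrative binary-classification dataset.
# Any tabular CSV with a target column works; these columns are hypothetical.
os.makedirs("data_samples", exist_ok=True)
df = pd.DataFrame(
    {
        "age": [25, 38, 52, 47, 31, 60],
        "education": ["Bachelors", "HS-grad", "Masters", "HS-grad", "Bachelors", "Doctorate"],
        "hours_per_week": [40, 45, 50, 38, 42, 36],
        "income": [0, 0, 1, 1, 0, 1],  # binary target
    }
)
df.to_csv("data_samples/binary_classification.csv", index=False)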
Parameters
###############################################################################
### required parameters
###############################################################################
# path to training data
train_filename = "data_samples/binary_classification.csv"
# path to output folder to store artifacts
output = "output"
###############################################################################
### optional parameters
###############################################################################
# path to test data. if specified, the model will be evaluated on the test data
# and test_predictions.csv will be saved to the output folder
# if not specified, only OOF predictions will be saved
# test_filename = "test.csv"
test_filename = None
# task: classification or regression
# if not specified, the task will be inferred automatically
# task = "classification"
# task = "regression"
task = None
# an id column
# if not specified, the id column will be generated automatically with the name `id`
# idx = "id"
idx = None
# targets is a list of strings (target column names)
# if not specified, the target column will be assumed to be named `target`
# and the problem will be treated as one of: binary classification, multiclass classification,
# or single column regression
# targets = ["target"]
# targets = ["target1", "target2"]
targets = ["income"]
# features is a list of strings (feature column names)
# if not specified, all columns except `id`, `targets` & `kfold` columns will be used
# features = ["col1", "col2"]
features = None
# categorical_features is a list of strings
# if not specified, categorical columns will be inferred automatically
# categorical_features = ["col1", "col2"]
categorical_features = None
# use_gpu is a boolean
# if not specified, GPU is not used
# use_gpu = True
# use_gpu = False
use_gpu = True
# number of folds to use for cross-validation
# default is 5
num_folds = 5
# random seed for reproducibility
# default is 42
seed = 42
# number of optuna trials to run
# default is 1000
# num_trials = 1000
num_trials = 100
# time_limit for optuna trials in seconds
# if not specified, timeout is not set and all trials are run
# time_limit = None
time_limit = 360
# if fast is set to True, the hyperparameter tuning will use only one fold
# however, the model will be trained on all folds in the end
# to generate OOF predictions and test predictions
# default is False
# fast = False
fast = False
Python API
To train a new model, you can run:
from autolgbm import AutoLGBM
# required parameters:
train_filename = "data_samples/binary_classification.csv"
output = "output"
# optional parameters
test_filename = None
task = None
idx = None
targets = ["income"]
features = None
categorical_features = None
use_gpu = True
num_folds = 5
seed = 42
num_trials = 100
time_limit = 360
fast = False
# Now it's time to train the model!
algbm = AutoLGBM(
train_filename=train_filename,
output=output,
test_filename=test_filename,
task=task,
idx=idx,
targets=targets,
features=features,
categorical_features=categorical_features,
use_gpu=use_gpu,
num_folds=num_folds,
seed=seed,
num_trials=num_trials,
time_limit=time_limit,
fast=fast,
)
algbm.train()
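After training, artifacts are written to the output folder. As a rough sketch, you can load the predictions with pandas; note that the exact file names below (oof_predictions.csv in particular) are assumptions, so check the contents of your output directory.
import pandas as pd

# The OOF predictions file name is an assumption; test_predictions.csv is the
# name mentioned above and is only written when test_filename is provided.
oof = pd.read_csv("output/oof_predictions.csv")
print(oof.head())

test_preds = pd.read_csv("output/test_predictions.csv")  # only if test_filename was set
print(test_preds.head())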
CLI
Train the model using the autolgbm train command. The parameters are the same as above.
autolgbm train \
--train_filename datasets/30train.csv \
--output outputs/30days \
--test_filename datasets/30test.csv \
--use_gpu
You can also serve the trained model using the autolgbm serve
command.
autolgbm serve --model_path outputs/mll --host 0.0.0.0 --debug
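Once the server is running, you can send rows over HTTP. This is a minimal sketch only: the port, the /predict route, and the JSON payload layout are assumptions, so check the interactive FastAPI docs exposed by the running server (usually at /docs) for the actual schema.
import requests

# Hypothetical request -- the endpoint path, port, and payload format are
# assumptions, not taken from the autolgbm documentation above.
row = {"age": 39, "education": "Bachelors", "hours_per_week": 40}
response = requests.post("http://127.0.0.1:8080/predict", json=row)
print(response.status_code, response.json())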
To learn more about a command, run:
`autolgbm <command> --help`
autolgbm train --help
usage: autolgbm <command> [<args>] train [-h] --train_filename TRAIN_FILENAME [--test_filename TEST_FILENAME] --output
OUTPUT [--task {classification,regression}] [--idx IDX] [--targets TARGETS]
[--num_folds NUM_FOLDS] [--features FEATURES] [--use_gpu] [--fast]
[--seed SEED] [--time_limit TIME_LIMIT]
optional arguments:
-h, --help show this help message and exit
--train_filename TRAIN_FILENAME
Path to training file
--test_filename TEST_FILENAME
Path to test file
--output OUTPUT Path to output directory
--task {classification,regression}
User defined task type
--idx IDX ID column
--targets TARGETS Target column(s). If there are multiple targets, separate by ';'
--num_folds NUM_FOLDS
Number of folds to use
--features FEATURES Features to use, separated by ';'
--use_gpu Whether to use GPU for training
--fast Whether to use fast mode for tuning params. Only one fold will be used if fast mode is set
--seed SEED Random seed
--time_limit TIME_LIMIT
Time limit for optimization
Download files
Source Distribution
autolgbm-0.0.3.tar.gz (803.5 kB)
Built Distribution
autolgbm-0.0.3-py3-none-any.whl (20.9 kB)
File details
Details for the file autolgbm-0.0.3.tar.gz.
File metadata
- Download URL: autolgbm-0.0.3.tar.gz
- Upload date:
- Size: 803.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.10.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.7.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3828a29e380a0c641e3170441a6f357689daa1055d842caf06cb6aa712b6fe86
MD5 | 8f13b9200dad85fc26d892eeca43058d
BLAKE2b-256 | a25e8cee4edf11bfa93c0303294797ae3d52e47eac222f67f0a7a31029736336
File details
Details for the file autolgbm-0.0.3-py3-none-any.whl.
File metadata
- Download URL: autolgbm-0.0.3-py3-none-any.whl
- Upload date:
- Size: 20.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.10.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.7.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 689c0f45204999e5c081657e2ce74de4b24b7a21108740f522a56b6ab5199373
MD5 | 738f2e0c9a1b4980891a8faeb474dc25
BLAKE2b-256 | 9460e4758aa65eda1a2ea3e98ab35a6af5cd115360a6c3e8d6e8e62d51486c45