
Fast and customizable framework for automatic ML model creation (AutoML)

Project description

LightAutoML - automatic model creation framework


LightAutoML (LAMA) is an AutoML framework which provides automatic model creation for the following tasks:

  • binary classification
  • multiclass classification
  • regression

The current version of the package handles datasets in which each row is an independent sample, i.e. each row is an object with its own features and target. Multi-table datasets and sequences are a work in progress.
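As a toy illustration of this row-per-object layout, each row carries its own features and target; the column names below (`age`, `income`, `target`) are invented for the example and are not part of LightAutoML:

```python
# Toy dataset in the layout LightAutoML expects: every row is one
# independent sample with its own features and target column.
rows = [
    {"age": 34, "income": 52_000, "target": 1},
    {"age": 51, "income": 38_000, "target": 0},
    {"age": 29, "income": 61_000, "target": 1},
]

# Feature columns are everything except the target.
feature_names = [k for k in rows[0] if k != "target"]
X = [[row[c] for c in feature_names] for row in rows]
y = [row["target"] for row in rows]
```

In practice you would load such a table with `pandas.read_csv`, as in the Quick tour below.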

Note: we use AutoWoE library to automatically create interpretable models.

Authors: Alexander Ryzhkov, Anton Vakhrushev, Dmitry Simakov, Vasilii Bunakov, Rinchin Damdinov, Alexander Kirilin, Pavel Shvets.

Documentation of LightAutoML is available here; you can also generate it from the source code.

(New feature) GPU pipeline

A full GPU pipeline for LightAutoML is currently available for developer testing (still in progress). The code and tutorials are available here.


Installation

To install the LAMA framework on your machine from PyPI, execute the following commands:

# Install base functionality:

pip install -U lightautoml

# For partial installation, use the corresponding option.
# Extra dependencies: [nlp, cv, report]
# Or use 'all' to install everything

pip install -U lightautoml[nlp]

Additionally, run the following commands to enable PDF report generation:

# MacOS
brew install cairo pango gdk-pixbuf libffi

# Debian / Ubuntu
sudo apt-get install build-essential libcairo2 libpango-1.0-0 libpangocairo-1.0-0 libgdk-pixbuf2.0-0 libffi-dev shared-mime-info

# Fedora
sudo yum install redhat-rpm-config libffi-devel cairo pango gdk-pixbuf2

# Windows
# follow this tutorial https://weasyprint.readthedocs.io/en/stable/install.html#windows


Quick tour

Let's solve the popular Kaggle Titanic competition below. There are two main ways to solve machine learning problems using LightAutoML: use a ready-made preset, or build your own custom pipeline (shown in the "For developers" section):

  • Use a ready-made preset for tabular data:
import pandas as pd
from sklearn.metrics import f1_score

from lightautoml.automl.presets.tabular_presets import TabularAutoML
from lightautoml.tasks import Task

df_train = pd.read_csv('../input/titanic/train.csv')
df_test = pd.read_csv('../input/titanic/test.csv')

automl = TabularAutoML(
    task = Task(
        name = 'binary',
        metric = lambda y_true, y_pred: f1_score(y_true, (y_pred > 0.5)*1))
)
oof_pred = automl.fit_predict(
    df_train,
    roles = {'target': 'Survived', 'drop': ['PassengerId']}
)
test_pred = automl.predict(df_test)

pd.DataFrame({
    'PassengerId':df_test.PassengerId,
    'Survived': (test_pred.data[:, 0] > 0.5)*1
}).to_csv('submit.csv', index = False)
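The `metric` lambda above turns probabilistic predictions into hard labels at a 0.5 threshold before scoring. A minimal pure-Python sketch of that thresholding and the F1 computation, using made-up toy values (no sklearn dependency):

```python
def f1(y_true, y_label):
    # Standard F1 for the positive class: harmonic mean of precision and recall.
    tp = sum(1 for t, p in zip(y_true, y_label) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_label) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_label) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy ground truth and predicted probabilities (invented for illustration).
y_true = [1, 0, 1, 1, 0]
y_prob = [0.9, 0.2, 0.4, 0.8, 0.6]

# Same thresholding as the lambda passed to Task: (y_pred > 0.5) * 1
y_label = [(p > 0.5) * 1 for p in y_prob]

score = f1(y_true, y_label)
```

The same `> 0.5` thresholding is applied to `test_pred.data[:, 0]` when building the submission file.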

The LightAutoML framework has many ready-to-use components and extensive customization options; to learn more, check out the resources section.


Resources

Kaggle kernel examples of LightAutoML usage:

Google Colab tutorials and other examples:

  • Tutorial_1_basics.ipynb - get started with LightAutoML on tabular data.
  • Tutorial_2_WhiteBox_AutoWoE.ipynb - creating interpretable models.
  • Tutorial_3_sql_data_source.ipynb - shows how to use LightAutoML presets (both standalone and time-utilized variants) for solving ML tasks on tabular data from an SQL database instead of CSV files.
  • Tutorial_4_NLP_Interpretation.ipynb - example of using TabularNLPAutoML preset, LimeTextExplainer.
  • Tutorial_5_uplift.ipynb - shows how to use LightAutoML for an uplift modeling task.
  • Tutorial_6_custom_pipeline.ipynb - shows how to create your own pipeline from specified blocks: pipelines for feature generation and feature selection, ML algorithms, hyperparameter optimization etc.
  • Tutorial_7_ICE_and_PDP_interpretation.ipynb - shows how to obtain local and global interpretation of model results using ICE and PDP approaches.

Note 1: the profiler is disabled by default; do not enable it in production, since it increases run time and memory consumption.

Note 2: to inspect the report after the run, comment out the last line of the demo, which deletes the report.

Courses, videos and papers


Contributing to LightAutoML

If you are interested in contributing to LightAutoML, please read the Contributing Guide to get started.


License

This project is licensed under the Apache License, Version 2.0. See LICENSE file for more details.


For developers

Installation from source code

First of all, you need to install git and poetry.

# Load LAMA source code
git clone https://github.com/AILab-MLTools/LightAutoML.git

cd LightAutoML/

# Choose exactly one of the two options below!

# 1. Global installation: do not create a virtual environment
poetry config virtualenvs.create false --local

# 2. Recommended: create a virtual environment inside the project directory
poetry config virtualenvs.in-project true

# For more information read poetry docs

# Install LAMA
poetry lock
poetry install

Build your own custom pipeline:

import pandas as pd

from lightautoml.automl.base import AutoML
from lightautoml.ml_algo.boost_lgbm import BoostLGBM
from lightautoml.ml_algo.tuning.optuna import OptunaTuner
from lightautoml.pipelines.features.lgb_pipeline import LGBSimpleFeatures
from lightautoml.pipelines.ml.base import MLPipeline
from lightautoml.pipelines.selection.importance_based import (
    ImportanceCutoffSelector,
    ModelBasedImportanceEstimator,
)
from lightautoml.reader.base import PandasToPandasReader
from lightautoml.tasks import Task

N_THREADS = 4      # threads for LightGBM training
N_FOLDS = 5        # cross-validation folds
RANDOM_STATE = 42  # fixed seed for reproducibility

df_train = pd.read_csv('../input/titanic/train.csv')
df_test = pd.read_csv('../input/titanic/test.csv')

# define the machine learning problem: binary classification
task = Task('binary')

# the reader parses a pandas DataFrame into LightAutoML's internal dataset
reader = PandasToPandasReader(task, cv=N_FOLDS, random_state=RANDOM_STATE)

# create a feature selector
model0 = BoostLGBM(
    default_params={'learning_rate': 0.05, 'num_leaves': 64,
    'seed': 42, 'num_threads': N_THREADS}
)
pipe0 = LGBSimpleFeatures()
mbie = ModelBasedImportanceEstimator()
selector = ImportanceCutoffSelector(pipe0, model0, mbie, cutoff=0)

# build first level pipeline for AutoML
pipe = LGBSimpleFeatures()
# stop after 20 iterations or after 30 seconds
params_tuner1 = OptunaTuner(n_trials=20, timeout=30)
model1 = BoostLGBM(
    default_params={'learning_rate': 0.05, 'num_leaves': 128,
    'seed': 1, 'num_threads': N_THREADS}
)
model2 = BoostLGBM(
    default_params={'learning_rate': 0.025, 'num_leaves': 64,
    'seed': 2, 'num_threads': N_THREADS}
)
pipeline_lvl1 = MLPipeline([
    (model1, params_tuner1),
    model2
], pre_selection=selector, features_pipeline=pipe, post_selection=None)

# build second level pipeline for AutoML
pipe1 = LGBSimpleFeatures()
model = BoostLGBM(
    default_params={'learning_rate': 0.05, 'num_leaves': 64,
    'max_bin': 1024, 'seed': 3, 'num_threads': N_THREADS},
    freeze_defaults=True
)
pipeline_lvl2 = MLPipeline(
    [model],
    pre_selection=None,
    features_pipeline=pipe1,
    post_selection=None,
)

# build AutoML pipeline
automl = AutoML(reader, [
    [pipeline_lvl1],
    [pipeline_lvl2],
], skip_conn=False)

# train AutoML and get predictions
oof_pred = automl.fit_predict(df_train, roles = {'target': 'Survived', 'drop': ['PassengerId']})
test_pred = automl.predict(df_test)

pd.DataFrame({
    'PassengerId':df_test.PassengerId,
    'Survived': (test_pred.data[:, 0] > 0.5)*1
}).to_csv('submit.csv', index = False)
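To make the two-level structure above concrete: with `skip_conn=False`, the second level sees only the first level's predictions, not the raw features. A deliberately simplified pure-Python sketch of that data flow; the "models" here are stand-in functions invented for illustration, not LightAutoML objects:

```python
# Stand-in level-1 "models": each maps a feature vector to a probability.
def model_a(x):
    return min(1.0, max(0.0, 0.1 * x[0]))

def model_b(x):
    return min(1.0, max(0.0, 0.05 * x[1]))

# Stand-in level-2 "model": a simple blender that averages its inputs.
def blender(preds):
    return sum(preds) / len(preds)

def two_level_predict(x, skip_conn=False):
    level1_out = [model_a(x), model_b(x)]
    # skip_conn=False: level 2 receives only level-1 predictions.
    # With skip_conn=True it would also receive the raw features x.
    level2_in = level1_out + (list(x) if skip_conn else [])
    return blender(level2_in)

p = two_level_predict((4.0, 10.0))  # level-1 outputs: 0.4 and 0.5
```

In the real pipeline, level 1 produces out-of-fold predictions during `fit_predict`, so the level-2 model trains on predictions that were not made on its own training folds.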


Support and feature requests

Seek prompt advice in the Telegram group.

Open bug reports and feature requests on GitHub issues.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

LightAutoML-0.3.4.tar.gz (225.0 kB view details)

Uploaded Source

Built Distribution

LightAutoML-0.3.4-py3-none-any.whl (296.2 kB view details)

Uploaded Python 3

File details

Details for the file LightAutoML-0.3.4.tar.gz.

File metadata

  • Download URL: LightAutoML-0.3.4.tar.gz
  • Upload date:
  • Size: 225.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.12 CPython/3.6.9 Linux/4.15.0-143-generic

File hashes

Hashes for LightAutoML-0.3.4.tar.gz
Algorithm Hash digest
SHA256 53147c58642760f5c5cb8247238f4eea9200972fe6deb75500d7db58d7827dcc
MD5 69322fa23e24fbcd406dd559de89e569
BLAKE2b-256 3cfb50b745f44a129a9d78d497fce0f2f7f4d00a6ec482d2e3b46db4dfee57d3

See more details on using hashes here.

File details

Details for the file LightAutoML-0.3.4-py3-none-any.whl.

File metadata

  • Download URL: LightAutoML-0.3.4-py3-none-any.whl
  • Upload date:
  • Size: 296.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.12 CPython/3.6.9 Linux/4.15.0-143-generic

File hashes

Hashes for LightAutoML-0.3.4-py3-none-any.whl
Algorithm Hash digest
SHA256 f97652aa4d29beb6b7a8ee528aea121eb35bde6371ebffa7b0ecfc2e45cdf9d3
MD5 ade192da978a31c5f2bbbeb58aa105c8
BLAKE2b-256 de285bc660706456880b313a9cfffd170b3abe5bc0ec59be68cbfee2e7604abd

