Machine learning regression off-the-shelf

Project description

Machine learning regression (mlregression)

Machine Learning Regression (mlregression) is an off-the-shelf implementation of the most popular ML regression methods that automatically takes care of fitting and hyperparameter tuning.

Currently, the fully implemented models include:

  • Ensemble trees (Random forests, XGBoost, LightGBM, GradientBoostingRegressor, ExtraTreesRegressor)
  • Penalized regression (Ridge, Lasso, ElasticNet, Lars, LassoLars)
  • Neural nets (simple neural nets with 1-5 hidden layers, ReLU activation, and early stopping)

NB! When using penalized regressions, consider using the native CV implementations from scikit-learn for speed. See Example 6 below.

In addition, any other scikit-learn regressor can be supplied (e.g., HuberRegressor or BayesianRidge), but then one has to provide a parameter grid as well! See the sketch following Example 6 below.

Please contact the author below if you find any bugs or have any suggestions for improvement. Thank you!

Author: Nicolaj Søndergaard Mühlbach (n.muhlbach at gmail dot com, muhlbach at mit dot edu)

Code dependencies

This code has the following dependencies:

  • Python 3.6+
  • numpy 1.19+
  • pandas 1.3+
  • scikit-learn 1+
  • xgboost 1.3+
  • lightgbm 3.2+
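
Installation

Assuming the package is published on PyPI under the name mlregression, it can typically be installed with pip into a Python 3.6+ environment (the dependencies above should then be resolved automatically or installed alongside it):

pip install mlregression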

Usage

We demonstrate the use of mlregression below, using random forests, XGBoost, LightGBM, neural nets, and penalized regression as underlying regressors.

#------------------------------------------------------------------------------
# Libraries
#------------------------------------------------------------------------------
# Standard
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

# This library
from mlregression.mlreg import MLRegressor
from mlregression.mlreg import RF
from mlregression.estimator.boosting import XGBRegressor, LGBMegressor

#------------------------------------------------------------------------------
# Data
#------------------------------------------------------------------------------
# Generate data
X, y = make_regression(n_samples=500,
                       n_features=10, 
                       n_informative=5,
                       n_targets=1,
                       bias=0.0,
                       coef=False,
                       random_state=1991)

X_train, X_test, y_train, y_test = train_test_split(X, y)

#------------------------------------------------------------------------------
# Example 1: Main use of MLRegressor
#------------------------------------------------------------------------------
# Instantiate model and specify the underlying regressor by a string
mlreg = MLRegressor(estimator="RandomForestRegressor",
                    max_n_models=2)

# Fit
mlreg.fit(X=X_train, y=y_train)

# Predict
y_hat = mlreg.predict(X=X_test)

# Access all the usual attributes
mlreg.best_score_
mlreg.best_estimator_

# Compute the score
mlreg.score(X=X_test,y=y_test)

#------------------------------------------------------------------------------
# Example 2: RF
#------------------------------------------------------------------------------
# Instantiate model
rf = RF(max_n_models=2)

# Fit
rf.fit(X=X_train, y=y_train)

# Predict and score
rf.score(X=X_test, y=y_test)

#------------------------------------------------------------------------------
# Example 3: XGBoost
#------------------------------------------------------------------------------
# Instantiate model
xgb = MLRegressor(estimator=XGBRegressor(),
                  max_n_models=2)

# Fit
xgb.fit(X=X_train, y=y_train)

# Predict and score
xgb.score(X=X_test, y=y_test)

#------------------------------------------------------------------------------
# Example 4: LightGBM
#------------------------------------------------------------------------------
# Instantiate model
lgbm = MLRegressor(estimator=LGBMegressor(),
                   max_n_models=2)

# Fit
lgbm.fit(X=X_train, y=y_train)

# Predict and score
lgbm.score(X=X_test, y=y_test)

#------------------------------------------------------------------------------
# Example 5: Neural Nets
#------------------------------------------------------------------------------
# Instantiate model
nn = MLRegressor(estimator="MLPRegressor",
                 max_n_models=2)

# Fit
nn.fit(X=X_train, y=y_train)

# Predict and score
nn.score(X=X_test, y=y_test)

#------------------------------------------------------------------------------
# Example 6: LassoCV/RidgeCV/ElasticNetCV/LarsCV/LassoLarsCV (native scikit-learn implementation)
#------------------------------------------------------------------------------
# Instantiate model
penalized = MLRegressor(estimator="LassoCV")

# Fit
penalized.fit(X=X_train, y=y_train)

# Predict and score
penalized.score(X=X_test, y=y_test)
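
As noted above, any other scikit-learn regressor can be supplied together with a user-provided parameter grid. The sketch below illustrates this with HuberRegressor; note that the keyword used to pass the grid (param_grid) and the grid values themselves are assumptions for illustration and should be checked against the actual MLRegressor signature.

#------------------------------------------------------------------------------
# Example 7 (sketch): Custom scikit-learn regressor with a user-supplied grid
#------------------------------------------------------------------------------
# NOTE: This is a sketch only. The "param_grid" keyword and the grid values
# below are assumptions for illustration; check the MLRegressor signature for
# the actual argument name.
from sklearn.linear_model import HuberRegressor

# Hypothetical parameter grid for HuberRegressor
param_grid = {
    "epsilon": [1.1, 1.35, 2.0],
    "alpha": [0.0001, 0.001, 0.01],
}

# Instantiate model with a custom estimator and the grid
huber = MLRegressor(estimator=HuberRegressor(),
                    param_grid=param_grid,
                    max_n_models=2)

# Fit
huber.fit(X=X_train, y=y_train)

# Predict and score
huber.score(X=X_test, y=y_test)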
