An automated machine learning and deep learning library in Python.

Project description

automlkiller

Automated Machine Learning

Usage

  1. Step 1: Load data and preprocess
model = AUTOML(X, y,
                cleancolumnname = {},
                datatype = {"categorical_columns": [], "numeric_columns":[], "time_columns":[]},
                simpleimputer = {"numeric_strategy": "mean", "categorical_strategy": "most_frequent"},
                zeronearzerovariance = {"threshold_first" : 0.1, "threshold_second": 20},
                categoryencoder = {"cols": [], "method": "targetencoder"},
                groupsimilarfeature = {"group_name": [], "list_of_group_feature": []},
                binning = {"features_to_discretize": []},
                maketimefeature = {"time_columns": [], "list_of_feature": ['month',  'dayofweek', 'weekday', 'is_month_end', 'is_month_start', 'hour']},
                scaling = {"method": "zscore", "numeric_columns": []},
                # outlier = {"methods": ["pca", "iforest", "knn"], "contamination": 0.2},
                removeperfectmulticollinearity = {},
                makenonlinearfeature = {"polynomial_columns": [], "degree": 2, "interaction_only": False, "include_bias": False, "other_nonlinear_feature": ["sin", "cos", "tan"]},
                # rfe = {"estimator": None, "step": 1, "min_features_to_select": 3, "cv": 3},
                # reducedimension = {"method": "pca_linear", "n_components": 0.99}
                )
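The `simpleimputer` option above fills missing values using the column mean for numeric features and the most frequent value for categorical features. A minimal pure-Python sketch of those two strategies (the helper names are illustrative, not part of automlkiller's API):

```python
from collections import Counter

def impute_numeric_mean(values):
    """Replace None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def impute_categorical_most_frequent(values):
    """Replace None entries with the most common observed value."""
    observed = [v for v in values if v is not None]
    most_frequent = Counter(observed).most_common(1)[0][0]
    return [most_frequent if v is None else v for v in values]

print(impute_numeric_mean([1.0, None, 3.0]))                    # [1.0, 2.0, 3.0]
print(impute_categorical_most_frequent(["a", None, "a", "b"]))  # ['a', 'a', 'a', 'b']
```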
  2. Step 2: Train models
model.create_model(estimator=['classification-lgbmclassifier',
                            # 'classification-kneighborsclassifier',
                            'classification-logisticregression',
                            # 'classification-xgbclassifier',
                            # 'classification-catboostclassifier',
                            # 'classification-randomforestclassifier'
                            ],
                verbose = True,
                n_jobs = 2,
                cv = 2,
                estimator_params = {
                            'classification-lgbmclassifier': {'n_jobs': 8},
                },
                scoring = ['accuracy', 'roc_auc', 'recall', 'precision', 'f1']
            )
model.ensemble_model(scoring = ['accuracy'])
model.voting_model(scoring = ['accuracy'])
model.stacking_model(scoring = ['accuracy'])
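`voting_model` combines the trained classifiers by hard voting: each model predicts a label, and the label chosen by the most models wins. A self-contained sketch of that idea in plain Python (illustrative only, not automlkiller's internal implementation):

```python
from collections import Counter

def majority_vote(predictions_per_model):
    """Hard-voting ensemble: for each sample, return the label predicted by the most models."""
    n_samples = len(predictions_per_model[0])
    voted = []
    for i in range(n_samples):
        votes = [preds[i] for preds in predictions_per_model]
        voted.append(Counter(votes).most_common(1)[0][0])
    return voted

# Three models' predictions for four samples
preds = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
]
print(majority_vote(preds))  # [1, 0, 1, 1]
```

Stacking differs in that a meta-model is trained on the base models' predictions instead of a fixed vote.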
  3. Step 3: Review model performance
model.report_tensorboard()
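The `scoring` metrics requested in Step 2 (accuracy, precision, recall, F1) can all be derived from the confusion-matrix counts. A hedged pure-Python sketch for the binary case, to show what those numbers mean (the function name is illustrative, not part of automlkiller):

```python
def binary_classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall and F1 for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}

m = binary_classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
print(m)  # accuracy 0.6; precision, recall and f1 all 2/3
```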

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

automlkiller-0.0.29.tar.gz (36.2 kB)

Uploaded Source

Built Distribution

automlkiller-0.0.29-py3-none-any.whl (83.3 kB)

Uploaded Python 3

File details

Details for the file automlkiller-0.0.29.tar.gz.

File metadata

  • Download URL: automlkiller-0.0.29.tar.gz
  • Upload date:
  • Size: 36.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.54.0 CPython/3.8.6

File hashes

Hashes for automlkiller-0.0.29.tar.gz
  • SHA256: 8ea53394925a149464f442e799b6a2be4a4a7ed4d398f347185c38aa237323d1
  • MD5: a05f334254ace5de5f7ca39f79a453d6
  • BLAKE2b-256: e84d6bf0c0e6d71e8df72cbbf4b29270271908cfe8967829d24179bd01eae047

See more details on using hashes here.

File details

Details for the file automlkiller-0.0.29-py3-none-any.whl.

File metadata

  • Download URL: automlkiller-0.0.29-py3-none-any.whl
  • Upload date:
  • Size: 83.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.54.0 CPython/3.8.6

File hashes

Hashes for automlkiller-0.0.29-py3-none-any.whl
  • SHA256: f98e405c37b2b010a25cc91243fb3ec1d2c742e6df434fbb0a693db484833929
  • MD5: bfa455b8cc0a7646df1643c4322adf7e
  • BLAKE2b-256: e5712e3aa2b19f879135ad063d6c7486268cce5c6a57e839a83d6f01cc684d7d

See more details on using hashes here.
