
Python automated machine learning framework.


NiaAML is an automated machine learning Python framework based on nature-inspired optimization algorithms. The name comes from the automated machine learning method of the same name [1]. Its goal is to efficiently compose the best possible classification pipeline for a given task from the components provided on the input. The components are divided into three groups: feature selection algorithms, feature transformation algorithms and classifiers. The framework uses nature-inspired optimization algorithms to choose the best set of components for the output classification pipeline and to tune their parameters. The optimization itself relies on NiaPy, a popular Python collection of nature-inspired algorithms. NiaAML is easy to use and to customize or extend to suit your needs.

The NiaAML framework lets you run not only full pipeline optimization, but also the individual implemented components, such as classifiers and feature selection algorithms, on their own. It supports numerical and categorical features as well as missing values in datasets.
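To make the idea of "composing a pipeline from components" concrete, here is a deliberately tiny sketch in plain Python (not NiaAML's actual API): every candidate pipeline is a (feature selection, feature transformation, classifier) triple, and the optimizer looks for the triple that maximizes a fitness score. NiaAML replaces the exhaustive loop below with a nature-inspired search and a real fitness function such as cross-validated accuracy; the component names are taken from the lists further down, but the fitness function here is an arbitrary stand-in.

```python
import itertools

# Component pools, named after some of NiaAML's implemented components.
selectors = ["SelectKBest", "VarianceThreshold"]
transforms = ["Normalizer", "StandardScaler"]
classifiers = ["AdaBoost", "RandomForest"]

def toy_fitness(pipeline):
    # Arbitrary deterministic stand-in: a real fitness function would
    # train the candidate pipeline and score it on held-out data.
    return sum(len(name) for name in pipeline)

# Exhaustive search over all 2 * 2 * 2 candidate triples.
best = max(itertools.product(selectors, transforms, classifiers),
           key=toy_fitness)
print(best)
```

With realistic component pools the search space grows combinatorially, and each parameter of each component adds further dimensions, which is why NiaAML searches it with nature-inspired algorithms instead of enumeration.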

Installation

pip

Install NiaAML with pip3:

pip3 install niaaml

In case you would like to try out the latest pre-release version of the framework, install it using:

pip3 install niaaml --pre

Install From Source

In case you want to install directly from the source code, use:

git clone https://github.com/lukapecnik/NiaAML.git
cd NiaAML
python3 setup.py install

Graphical User Interface

You can find a simple graphical user interface for the NiaAML package here.

Usage

See the project’s repository for usage examples.

Components

The following sections list the currently implemented components, divided into groups: classifiers, feature selection algorithms and feature transformation algorithms. At the end you will also find the currently implemented fitness functions for the optimization process, categorical feature encoders and missing value imputers.

Classifiers

  • Adaptive Boosting (AdaBoost),

  • Bagging (Bagging),

  • Extremely Randomized Trees (ExtremelyRandomizedTrees),

  • Linear SVC (LinearSVC),

  • Multi Layer Perceptron (MultiLayerPerceptron),

  • Random Forest Classifier (RandomForest),

  • Decision Tree Classifier (DecisionTree),

  • K-Neighbors Classifier (KNeighbors),

  • Gaussian Process Classifier (GaussianProcess),

  • Gaussian Naive Bayes (GaussianNB),

  • Quadratic Discriminant Analysis (QuadraticDiscriminantAnalysis).

Feature Selection Algorithms

  • Select K Best (SelectKBest),

  • Select Percentile (SelectPercentile),

  • Variance Threshold (VarianceThreshold).

Nature-Inspired Feature Selection Algorithms

  • Bat Algorithm (BatAlgorithm),

  • Differential Evolution (DifferentialEvolution),

  • Self-Adaptive Differential Evolution (jDEFSTH),

  • Grey Wolf Optimizer (GreyWolfOptimizer),

  • Particle Swarm Optimization (ParticleSwarmOptimization).
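As a rough illustration of how a nature-inspired optimizer can drive feature selection (a deliberately simplified sketch, not NiaAML's implementation): represent the selected feature subset as a binary mask and let a simple mutate-and-accept loop evolve it toward higher fitness. The algorithms listed above explore the same kind of search space, but with population-based strategies instead of a single candidate.

```python
import random

random.seed(0)

N_FEATURES = 6
INFORMATIVE = {0, 3}  # pretend only these features carry signal

def fitness(mask):
    # Stand-in for a classifier's cross-validated score: reward keeping
    # informative features, mildly penalize keeping uninformative ones.
    selected = {i for i, bit in enumerate(mask) if bit}
    return len(selected & INFORMATIVE) - 0.1 * len(selected - INFORMATIVE)

# Start from a random mask and greedily accept mutations
# that do not decrease fitness.
mask = [random.randint(0, 1) for _ in range(N_FEATURES)]
for _ in range(200):
    candidate = [bit ^ int(random.random() < 0.2) for bit in mask]
    if fitness(candidate) >= fitness(mask):
        mask = candidate

print(mask, fitness(mask))
```

In NiaAML the mask (and the other pipeline choices) are encoded as a real-valued solution vector that NiaPy's algorithms optimize, but the feedback loop — propose, evaluate fitness, keep the better solution — is the same.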

Feature Transformation Algorithms

  • Normalizer (Normalizer),

  • Standard Scaler (StandardScaler),

  • Maximum Absolute Scaler (MaxAbsScaler),

  • Quantile Transformer (QuantileTransformer),

  • Robust Scaler (RobustScaler).
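For intuition, here is the arithmetic behind two of the listed transforms in plain Python (NiaAML itself wraps the scikit-learn implementations): standard scaling centers a feature to zero mean and unit variance, while max-abs scaling divides by the largest absolute value so results fall in [-1, 1].

```python
def standard_scale(xs):
    # (x - mean) / std: the math behind StandardScaler
    mean = sum(xs) / len(xs)
    std = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - mean) / std for x in xs]

def max_abs_scale(xs):
    # x / max(|x|): the math behind MaxAbsScaler
    biggest = max(abs(x) for x in xs)
    return [x / biggest for x in xs]

print(max_abs_scale([1.0, -2.0, 4.0]))  # [0.25, -0.5, 1.0]
print(standard_scale([1.0, 2.0, 3.0]))
```

Which transform works best depends on the data and the classifier, which is exactly why NiaAML treats the choice of transform as part of the optimization problem.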

Fitness Functions (based on)

  • Accuracy (Accuracy),

  • Cohen’s kappa (CohenKappa),

  • F1-Score (F1),

  • Precision (Precision).
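To make the metric definitions concrete, here are minimal plain-Python versions for binary labels (illustrative only; NiaAML's fitness functions build on scikit-learn's metrics). Accuracy is the fraction of correct predictions, while Cohen's kappa discounts the agreement that would occur by chance.

```python
def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision(y_true, y_pred, positive=1):
    # Fraction of predicted positives that are truly positive.
    hits = [t for t, p in zip(y_true, y_pred) if p == positive]
    return sum(t == positive for t in hits) / len(hits)

def recall(y_true, y_pred, positive=1):
    # Fraction of true positives that were predicted positive.
    actual = [p for t, p in zip(y_true, y_pred) if t == positive]
    return sum(p == positive for p in actual) / len(actual)

def f1(y_true, y_pred, positive=1):
    p = precision(y_true, y_pred, positive)
    r = recall(y_true, y_pred, positive)
    return 2 * p * r / (p + r)

def cohen_kappa(y_true, y_pred):
    n = len(y_true)
    p_observed = accuracy(y_true, y_pred)
    # Expected agreement if predictions matched labels only by chance.
    p_expected = sum((y_true.count(c) / n) * (y_pred.count(c) / n)
                     for c in set(y_true) | set(y_pred))
    return (p_observed - p_expected) / (1 - p_expected)

y_true, y_pred = [1, 1, 0, 0], [1, 0, 0, 0]
print(accuracy(y_true, y_pred))     # 0.75
print(cohen_kappa(y_true, y_pred))  # 0.5
```

The optimizer maximizes whichever of these you pick as the fitness function, so the choice directly shapes which pipelines win, e.g. F1 over accuracy on imbalanced datasets.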

Categorical Feature Encoders

  • One-Hot Encoder (OneHotEncoder).

Feature Imputers

  • Simple Imputer (SimpleImputer).
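The encoder and imputer handle the preprocessing that makes categorical features and missing values usable at all. In spirit (a hedged sketch, not NiaAML's code, which delegates to scikit-learn equivalents) they do the following:

```python
def one_hot(values):
    # OneHotEncoder idea: one 0/1 indicator column per category.
    categories = sorted(set(values))
    return [[int(v == c) for c in categories] for v in values]

def impute_mean(values):
    # SimpleImputer idea: replace missing entries (here None)
    # with the mean of the observed values.
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

print(one_hot(["red", "blue", "red"]))  # [[0, 1], [1, 0], [0, 1]]
print(impute_mean([1.0, None, 3.0]))    # [1.0, 2.0, 3.0]
```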

Licence

This package is distributed under the MIT License. This license can be found online at http://www.opensource.org/licenses/MIT.

Disclaimer

This framework is provided as-is, and there are no guarantees that it fits your purposes or that it is bug-free. Use it at your own risk!

References

[1] Iztok Fister Jr., Milan Zorman, Dušan Fister, Iztok Fister. Continuous optimizers for automatic design and evaluation of classification pipelines. In: Frontier applications of nature inspired computation. Springer tracts in nature-inspired computing, pp.281-301, 2020.
