AutoML for Text, Image, and Tabular Data

AutoGluon automates machine learning tasks, enabling you to easily achieve strong predictive performance in your applications. With just a few lines of code, you can train and deploy high-accuracy machine learning and deep learning models on text, image, and tabular data.

Example

# First install package from terminal:
# python3 -m pip install -U pip
# python3 -m pip install -U setuptools wheel
# python3 -m pip install autogluon  # autogluon==0.3.1

from autogluon.tabular import TabularDataset, TabularPredictor
train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
test_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/test.csv')
predictor = TabularPredictor(label='class').fit(train_data, time_limit=120)  # Fit models for 120s
leaderboard = predictor.leaderboard(test_data)
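
The same fitted predictor can then be used to generate predictions on new data. A minimal sketch continuing the example above (predict and predict_proba are TabularPredictor methods; the label column is dropped here only to mimic unlabeled data):

y_pred = predictor.predict(test_data.drop(columns=['class']))        # predicted labels
y_prob = predictor.predict_proba(test_data.drop(columns=['class']))  # predicted class probabilities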

Quickstart tutorials and API references are available for each AutoGluon task:

  • TabularPredictor: Quick Start | API
  • TextPredictor: Quick Start | API
  • ImagePredictor: Quick Start | API
  • ObjectDetector: Quick Start | API

Resources

See the AutoGluon Website for documentation and instructions on:

  • Scientific Publications
  • Articles
  • Hands-on Tutorials
  • Train/Deploy AutoGluon in the Cloud

Citing AutoGluon

If you use AutoGluon in a scientific publication, please cite the following paper:

Erickson, Nick, et al. "AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data." arXiv preprint arXiv:2003.06505 (2020).

BibTeX entry:

@article{agtabular,
  title={AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data},
  author={Erickson, Nick and Mueller, Jonas and Shirkov, Alexander and Zhang, Hang and Larroy, Pedro and Li, Mu and Smola, Alexander},
  journal={arXiv preprint arXiv:2003.06505},
  year={2020}
}

If you are using AutoGluon Tabular's model distillation functionality, please cite the following paper:

Fakoor, Rasool, et al. "Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation." Advances in Neural Information Processing Systems 33 (2020).

BibTeX entry:

@article{agtabulardistill,
  title={Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation},
  author={Fakoor, Rasool and Mueller, Jonas W and Erickson, Nick and Chaudhari, Pratik and Smola, Alexander J},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}

If you use AutoGluon's multimodal text+tabular functionality in a scientific publication, please cite the following paper:

Shi, Xingjian, et al. "Multimodal AutoML on Structured Tables with Text Fields." 8th ICML Workshop on Automated Machine Learning (AutoML). 2021.

BibTeX entry:

@inproceedings{agmultimodaltext,
  title={Multimodal AutoML on Structured Tables with Text Fields},
  author={Shi, Xingjian and Mueller, Jonas and Erickson, Nick and Li, Mu and Smola, Alex},
  booktitle={8th ICML Workshop on Automated Machine Learning (AutoML)},
  year={2021}
}

AutoGluon for Hyperparameter Optimization

AutoGluon also provides state-of-the-art tools for hyperparameter optimization, such as ASHA, Hyperband, Bayesian Optimization, and BOHB.
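
As a rough illustration, the sketch below enables hyperparameter tuning through the tabular API by passing hyperparameter_tune_kwargs to TabularPredictor.fit; the trial count, scheduler, and searcher shown are illustrative assumptions, not recommended settings.

from autogluon.tabular import TabularDataset, TabularPredictor

# Sketch: tune model hyperparameters during fit (settings below are assumptions, not recommendations)
train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
hpo_kwargs = {'num_trials': 5, 'scheduler': 'local', 'searcher': 'auto'}
predictor = TabularPredictor(label='class').fit(
    train_data,
    hyperparameter_tune_kwargs=hpo_kwargs,  # enables HPO over the default hyperparameter spaces
    time_limit=120,
)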

To get started, check out our paper "Model-based Asynchronous Hyperparameter and Neural Architecture Search", arXiv preprint arXiv:2003.10865 (2020).

BibTeX entry:

@article{abohb,
  title={Model-based Asynchronous Hyperparameter and Neural Architecture Search},
  author={Klein, Aaron and Tiao, Louis and Lienart, Thibaut and Archambeau, Cedric and Seeger, Matthias},
  journal={arXiv preprint arXiv:2003.10865},
  year={2020}
}

License

This library is licensed under the Apache 2.0 License.

Contributing to AutoGluon

We are actively accepting code contributions to the AutoGluon project. If you are interested in contributing to AutoGluon, please read the Contributing Guide to get started.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

autogluon.core-0.3.2b20211220.tar.gz (270.7 kB, Source)

Built Distribution

autogluon.core-0.3.2b20211220-py3-none-any.whl (335.7 kB, Python 3)

File details

Details for the file autogluon.core-0.3.2b20211220.tar.gz.

File metadata

  • Download URL: autogluon.core-0.3.2b20211220.tar.gz
  • Upload date:
  • Size: 270.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.7.12

File hashes

Hashes for autogluon.core-0.3.2b20211220.tar.gz

  • SHA256: 53b0c70f6cc9287f061d83b936bb3aaa174d84abf228fa0a098937a3283e8f08
  • MD5: d940ef0e666a55a2c50a5eb2bd8e3ffc
  • BLAKE2b-256: 3523baddd6dfca47fca750eb09f9954955d29cb96a5495dfec949fd69a9c62c4

See more details on using hashes here.
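
For example, a downloaded archive can be checked against the SHA256 digest above with Python's standard hashlib module (the local file path is an assumption about where the archive was saved):

import hashlib

# Compare the archive's SHA256 digest to the value listed above
expected = "53b0c70f6cc9287f061d83b936bb3aaa174d84abf228fa0a098937a3283e8f08"
with open("autogluon.core-0.3.2b20211220.tar.gz", "rb") as f:  # assumed local download path
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")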

File details

Details for the file autogluon.core-0.3.2b20211220-py3-none-any.whl.

File metadata

  • Download URL: autogluon.core-0.3.2b20211220-py3-none-any.whl
  • Upload date:
  • Size: 335.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.7.12

File hashes

Hashes for autogluon.core-0.3.2b20211220-py3-none-any.whl

  • SHA256: 07c926cfc9c9ea05721d9161a99df38416e4e16d61400d5804192cee20390b3f
  • MD5: 8360b021e9cbfd50d9b0acdb86ea800b
  • BLAKE2b-256: 1241a0fdbc60eba51da58f16a0062758b9b1a4c2a39fb50d7d3e61dab68798ec

See more details on using hashes here.
