
AutoML for Image, Text, Time Series, and Tabular Data


Install Instructions | Documentation (Stable | Latest)

AutoGluon automates machine learning tasks, enabling you to easily achieve strong predictive performance in your applications. With just a few lines of code, you can train and deploy high-accuracy machine learning and deep learning models on image, text, time series, and tabular data.

Example

# First install package from terminal:
# pip install -U pip
# pip install -U setuptools wheel
# pip install autogluon  # autogluon==0.8.2

from autogluon.tabular import TabularDataset, TabularPredictor
train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
test_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/test.csv')
predictor = TabularPredictor(label='class').fit(train_data, time_limit=120)  # Fit models for 120s
leaderboard = predictor.leaderboard(test_data)  # Compare all trained models on the test data
Quick Start tutorials and API references are available for each AutoGluon task:

TabularPredictor: Quick Start | API
MultiModalPredictor: Quick Start | API
TimeSeriesPredictor: Quick Start | API
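
For forecasting, the workflow is analogous to the tabular example above. The following is a minimal sketch of the TimeSeriesPredictor API (not taken from the quick-start guides); it assumes a hypothetical long-format CSV file train.csv with item_id, timestamp, and target columns, so adjust the path and column names to your data.

import pandas as pd
from autogluon.timeseries import TimeSeriesDataFrame, TimeSeriesPredictor

# Hypothetical long-format data: one row per (item_id, timestamp) pair with a 'target' column
df = pd.read_csv('train.csv')
train_data = TimeSeriesDataFrame.from_data_frame(
    df, id_column='item_id', timestamp_column='timestamp'
)

predictor = TimeSeriesPredictor(
    prediction_length=48,  # forecast horizon (number of future time steps)
    target='target',       # name of the column to forecast
    eval_metric='MASE',
).fit(train_data, time_limit=120)  # Fit models for 120s

predictions = predictor.predict(train_data)   # Probabilistic forecasts (mean and quantiles)
leaderboard = predictor.leaderboard(train_data)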

Resources

See the AutoGluon Website for documentation and instructions on installation and usage.

Refer to the AutoGluon Roadmap for details on upcoming features and releases.

Scientific Publications

Articles

Hands-on Tutorials

Train/Deploy AutoGluon in the Cloud

Contributing to AutoGluon

We are actively accepting code contributions to the AutoGluon project. If you are interested in contributing to AutoGluon, please read the Contributing Guide to get started.

Citing AutoGluon

If you use AutoGluon in a scientific publication, please cite the following paper:

Erickson, Nick, et al. "AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data." arXiv preprint arXiv:2003.06505 (2020).

BibTeX entry:

@article{agtabular,
  title={AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data},
  author={Erickson, Nick and Mueller, Jonas and Shirkov, Alexander and Zhang, Hang and Larroy, Pedro and Li, Mu and Smola, Alexander},
  journal={arXiv preprint arXiv:2003.06505},
  year={2020}
}

If you use AutoGluon Tabular's model distillation functionality, please cite the following paper:

Fakoor, Rasool, et al. "Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation." Advances in Neural Information Processing Systems 33 (2020).

BibTeX entry:

@article{agtabulardistill,
  title={Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation},
  author={Fakoor, Rasool and Mueller, Jonas W and Erickson, Nick and Chaudhari, Pratik and Smola, Alexander J},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}

If you use AutoGluon's multimodal text+tabular functionality in a scientific publication, please cite the following paper:

Shi, Xingjian, et al. "Multimodal AutoML on Structured Tables with Text Fields." 8th ICML Workshop on Automated Machine Learning (AutoML). 2021.

BibTeX entry:

@inproceedings{agmultimodaltext,
  title={Multimodal AutoML on Structured Tables with Text Fields},
  author={Shi, Xingjian and Mueller, Jonas and Erickson, Nick and Li, Mu and Smola, Alex},
  booktitle={8th ICML Workshop on Automated Machine Learning (AutoML)},
  year={2021}
}

If you use AutoGluon's time series forecasting functionality in a scientific publication, please cite the following paper:

Shchur, Oleksandr, et al. "AutoGluon-TimeSeries: AutoML for Probabilistic Time Series Forecasting." International Conference on Automated Machine Learning (AutoML). 2023.

BibTeX entry:

@inproceedings{agtimeseries,
  title={{AutoGluon-TimeSeries}: {AutoML} for Probabilistic Time Series Forecasting},
  author={Shchur, Oleksandr and Turkmen, Caner and Erickson, Nick and Shen, Huibin and Shirkov, Alexander and Hu, Tony and Wang, Yuyang},
  booktitle={International Conference on Automated Machine Learning},
  year={2023}
}

AutoGluon for Hyperparameter Optimization

AutoGluon's state-of-the-art tools for hyperparameter optimization, such as ASHA, Hyperband, Bayesian Optimization, and BOHB, have moved to the stand-alone package syne-tune.
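
For illustration only, the snippet below sketches how ASHA can be run through syne-tune on a hypothetical training script train_script.py that reports a mean_loss metric and an epoch resource counter via syne_tune.Reporter; argument names follow the syne-tune quick start and should be checked against the syne-tune documentation.

from syne_tune import Tuner, StoppingCriterion
from syne_tune.backend import LocalBackend
from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.baselines import ASHA

# Hyperparameter search space; 'epochs' is passed through to the script as a constant
config_space = {
    'lr': loguniform(1e-4, 1e-1),
    'batch_size': randint(16, 128),
    'epochs': 10,
}

tuner = Tuner(
    trial_backend=LocalBackend(entry_point='train_script.py'),  # hypothetical script reporting metrics
    scheduler=ASHA(
        config_space,
        metric='mean_loss',
        mode='min',
        resource_attr='epoch',
        max_resource_attr='epochs',
    ),
    stop_criterion=StoppingCriterion(max_wallclock_time=600),  # stop tuning after 10 minutes
    n_workers=4,
)
tuner.run()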

To learn more, check out our paper "Model-based Asynchronous Hyperparameter and Neural Architecture Search," arXiv preprint arXiv:2003.10865 (2020).

BibTeX entry:

@article{abohb,
  title={Model-based Asynchronous Hyperparameter and Neural Architecture Search},
  author={Klein, Aaron and Tiao, Louis and Lienart, Thibaut and Archambeau, Cedric and Seeger, Matthias},
  journal={arXiv preprint arXiv:2003.10865},
  year={2020}
}

License

This library is licensed under the Apache 2.0 License.

