AutoML for Image, Text, and Tabular Data
Project description
Install Instructions | Documentation (Stable | Latest)
AutoGluon automates machine learning tasks, enabling you to easily achieve strong predictive performance in your applications. With just a few lines of code, you can train and deploy high-accuracy machine learning and deep learning models on image, text, and tabular data.
Example
```python
# First install package from terminal:
# pip install -U pip
# pip install -U setuptools wheel
# pip install autogluon  # autogluon==0.5.2

from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
test_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/test.csv')
predictor = TabularPredictor(label='class').fit(train_data, time_limit=120)  # Fit models for 120 seconds
leaderboard = predictor.leaderboard(test_data)
```
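Once fit, the predictor can be used directly for inference and evaluation, for example:

```python
predictions = predictor.predict(test_data)          # predicted class labels
probabilities = predictor.predict_proba(test_data)  # predicted class probabilities
performance = predictor.evaluate(test_data)         # evaluation metrics on the labeled test set
```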
AutoGluon Task | Quickstart | API
---|---|---
TabularPredictor | |
TextPredictor | |
ImagePredictor | |
ObjectDetector | |
MultiModalPredictor | |
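Each of these tasks follows the same fit/predict workflow as the tabular example above. As an illustration, here is a minimal ImagePredictor sketch; the folder path is a placeholder, and argument details may differ slightly between releases:

```python
from autogluon.vision import ImageDataset, ImagePredictor

# Assumed layout: ./my_images/<class_name>/<image>.jpg ('./my_images' is a placeholder path).
train_data = ImageDataset.from_folder('./my_images')

predictor = ImagePredictor()
predictor.fit(train_data, time_limit=600)  # train image classifiers for up to 10 minutes

predictions = predictor.predict(train_data)  # predicted class for each image
```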
Resources
See the AutoGluon Website for documentation and instructions on:
- Tips to maximize accuracy (if benchmarking, make sure to run fit() with the argument presets='best_quality'; see the sketch below).
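For instance, a minimal best-quality run looks like the following (the CSV path and time limit are placeholders):

```python
from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset('train.csv')  # placeholder path to your training data
predictor = TabularPredictor(label='class').fit(
    train_data,
    presets='best_quality',  # maximize accuracy at the cost of longer training and larger models
    time_limit=3600,         # illustrative one-hour training budget
)
```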
Refer to the AutoGluon Roadmap for details on upcoming features and releases.
Scientific Publications
- AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data (arXiv, 2020)
- Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation (NeurIPS, 2020)
- Multimodal AutoML on Structured Tables with Text Fields (ICML AutoML Workshop, 2021)
Articles
- AutoGluon for tabular data: 3 lines of code to achieve top 1% in Kaggle competitions (AWS Open Source Blog, Mar 2020)
- Accurate image classification in 3 lines of code with AutoGluon (Medium, Feb 2020)
- AutoGluon overview & example applications (Towards Data Science, Dec 2019)
Hands-on Tutorials
Train/Deploy AutoGluon in the Cloud
- AutoGluon-Tabular on AWS Marketplace
- AutoGluon-Tabular on Amazon SageMaker
- AutoGluon Deep Learning Containers
Contributing to AutoGluon
We are actively accepting code contributions to the AutoGluon project. If you are interested in contributing to AutoGluon, please read the Contributing Guide to get started.
Citing AutoGluon
If you use AutoGluon in a scientific publication, please cite the following paper:
Erickson, Nick, et al. "AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data." arXiv preprint arXiv:2003.06505 (2020).
BibTeX entry:
@article{agtabular,
title={AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data},
author={Erickson, Nick and Mueller, Jonas and Shirkov, Alexander and Zhang, Hang and Larroy, Pedro and Li, Mu and Smola, Alexander},
journal={arXiv preprint arXiv:2003.06505},
year={2020}
}
If you are using AutoGluon Tabular's model distillation functionality, please cite the following paper:
Fakoor, Rasool, et al. "Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation." Advances in Neural Information Processing Systems 33 (2020).
BibTeX entry:
@article{agtabulardistill,
title={Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation},
author={Fakoor, Rasool and Mueller, Jonas W and Erickson, Nick and Chaudhari, Pratik and Smola, Alexander J},
journal={Advances in Neural Information Processing Systems},
volume={33},
year={2020}
}
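For reference, distillation is invoked on an already-fitted TabularPredictor. A minimal sketch follows; the time limit is illustrative and the exact signature may vary by release:

```python
# Assumes: predictor = TabularPredictor(label='class').fit(train_data, ...)
student_models = predictor.distill(time_limit=600)  # train smaller "student" models that mimic the ensemble
print(predictor.leaderboard(test_data))             # compare distilled students against the original models
```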
If you use AutoGluon's multimodal text+tabular functionality in a scientific publication, please cite the following paper:
Shi, Xingjian, et al. "Multimodal AutoML on Structured Tables with Text Fields." 8th ICML Workshop on Automated Machine Learning (AutoML). 2021.
BibTeX entry:
@inproceedings{agmultimodaltext,
title={Multimodal AutoML on Structured Tables with Text Fields},
author={Shi, Xingjian and Mueller, Jonas and Erickson, Nick and Li, Mu and Smola, Alex},
booktitle={8th ICML Workshop on Automated Machine Learning (AutoML)},
year={2021}
}
AutoGluon for Hyperparameter Optimization
AutoGluon's state-of-the-art tools for hyperparameter optimization, such as ASHA, Hyperband, Bayesian Optimization, and BOHB, have moved to the stand-alone package syne-tune.
To learn more, check out our paper "Model-based Asynchronous Hyperparameter and Neural Architecture Search" (arXiv preprint arXiv:2003.10865, 2020).
@article{abohb,
title={Model-based Asynchronous Hyperparameter and Neural Architecture Search},
author={Klein, Aaron and Tiao, Louis and Lienart, Thibaut and Archambeau, Cedric and Seeger, Matthias},
journal={arXiv preprint arXiv:2003.10865},
year={2020}
}
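For orientation, a rough sketch of launching ASHA with syne-tune is shown below. The training script name, config space, and metric names are illustrative assumptions, and constructor arguments may differ between syne-tune versions:

```python
from syne_tune import Tuner, StoppingCriterion
from syne_tune.backend import LocalBackend
from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.baselines import ASHA

# Hyperparameters passed to the training script (train_script.py is a hypothetical
# script that reports `validation_error` each epoch via syne_tune.Reporter).
config_space = {
    "lr": loguniform(1e-5, 1e-1),
    "batch_size": randint(16, 128),
    "epochs": 10,
}

tuner = Tuner(
    trial_backend=LocalBackend(entry_point="train_script.py"),
    scheduler=ASHA(
        config_space,
        metric="validation_error",
        resource_attr="epoch",
        max_resource_attr="epochs",
        mode="min",
    ),
    stop_criterion=StoppingCriterion(max_wallclock_time=600),  # stop tuning after 10 minutes
    n_workers=4,
)
tuner.run()
```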
License
This library is licensed under the Apache 2.0 License.
File details
Details for the file autogluon.vision-0.5.3b20220820.tar.gz.
File metadata
- Download URL: autogluon.vision-0.5.3b20220820.tar.gz
- Upload date:
- Size: 35.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.7.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9910ec997c33e0463b8c1bd5d682f9f867be892a98a4625cc456e46adfdf271c
MD5 | ee43169dddff7cb992d65495218b7698
BLAKE2b-256 | c9f64c2f655aa7169a55b6423df3003f7c881c0a30f16f7f250827f4fe69b51a
File details
Details for the file autogluon.vision-0.5.3b20220820-py3-none-any.whl.
File metadata
- Download URL: autogluon.vision-0.5.3b20220820-py3-none-any.whl
- Upload date:
- Size: 49.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.7.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | c33028e341310378a8c5a21646a3b54d6392a0b7d86b3fc918648fc3f9b100bf
MD5 | 8473078b298a33c52894f258d68941a6
BLAKE2b-256 | 23663899f2ccbe99491f65575d0f0d68310b3e55a9529a29d8155e8714740c4a