Multi-Agent System for Multimodal ML Automation
Project description
AutoGluon Assistant (aka MLZero)
Official implementation of MLZero: A Multi-Agent System for End-to-end Machine Learning Automation
AutoGluon Assistant (aka MLZero) is a multi-agent system that automates end-to-end multimodal machine learning or deep learning workflows by transforming raw multimodal data into high-quality ML solutions with zero human intervention. Leveraging specialized perception agents, dual-memory modules, and iterative code generation, it handles diverse data formats while maintaining high success rates across complex ML tasks.
💾 Installation
AutoGluon Assistant supports Python 3.8 - 3.11 and is currently available on Linux (dependency issues on macOS and Windows will be fixed in our next official release).
You can install from source (a new version will be released to PyPI soon):
pip install uv
uv pip install git+https://github.com/autogluon/autogluon-assistant.git
Quick Start
For detailed usage instructions, Anthropic/Azure/OpenAI setup, and advanced configuration options, see our Getting Started Tutorial.
1. API Setup
MLZero uses AWS Bedrock by default. Configure your AWS credentials:
export AWS_DEFAULT_REGION="<your-region>"
export AWS_ACCESS_KEY_ID="<your-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
We also support Anthropic, Azure, and OpenAI. Support for more LLM providers (e.g., DeepSeek) will be added soon.
2.1 CLI
mlzero -i <input_data_folder> [-t <optional_user_instructions>]
2.2 Web UI
mlzero-backend   # start the backend
mlzero-frontend  # start the frontend on port 8509 (default)
- Configure: Set your model provider and credentials in settings
- Upload & Describe: Drag your data folder into the chat input box, then type what you want to accomplish and press Enter
2.3 MCP (Model Context Protocol)
Note: The system can run on a single machine or distributed across multiple machines (e.g., server on EC2, client on local).
- Start the server
cd autogluon-assistant
mlzero-backend     # start the backend
mlzero-mcp-server  # start the MCP service (run this in a new terminal)
- Start the client
cd autogluon-assistant
mlzero-mcp-client
Note: You may need to set up port tunneling to expose your local MCP Client Server (port 8005) if you want to use it with remote LLM services (e.g., Claude API, OpenAI API).
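One common way to set up such tunneling is SSH remote port forwarding; a minimal sketch, assuming you control a remote host with SSH access (the user and host names below are placeholders, not part of MLZero):

```shell
# Expose the local MCP client server (port 8005) on the remote machine,
# so services running there can reach it at localhost:8005
ssh -R 8005:localhost:8005 <user>@<remote-host>
```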
2.4 Python API
from autogluon.assistant.coding_agent import run_agent

run_agent(
    input_data_folder="<your-input-folder>",
    output_folder="<your-output-folder>",
    # more args ...
)
Citation
If you use AutoGluon Assistant (MLZero) in your research, please cite our paper:
@misc{fang2025mlzeromultiagentendtoendmachine,
  title={MLZero: A Multi-Agent System for End-to-end Machine Learning Automation},
  author={Haoyang Fang and Boran Han and Nick Erickson and Xiyuan Zhang and Su Zhou and Anirudh Dagar and Jiani Zhang and Ali Caner Turkmen and Cuixiong Hu and Huzefa Rangwala and Ying Nian Wu and Bernie Wang and George Karypis},
  year={2025},
  eprint={2505.13941},
  archivePrefix={arXiv},
  primaryClass={cs.MA},
  url={https://arxiv.org/abs/2505.13941},
}
Download files
File details
Details for the file autogluon_assistant-1.0.0.tar.gz.
File metadata
- Download URL: autogluon_assistant-1.0.0.tar.gz
- Upload date:
- Size: 2.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d22bc743409bd3b814d49d7a3c9e8fdbfb6f2278cbf8bf850db37846ff166c4c |
| MD5 | a18db330a5620a7128e56cc19a8b99b4 |
| BLAKE2b-256 | c998e3fe520b455e1de54742a39d70957e8660ad381052402ed91782ad2bcf6b |
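To check a downloaded file against the digests published above, you can compute its SHA256 locally. A minimal sketch using only Python's standard library (the filename in the comment is the sdist from this release; `sha256_of_file` is an illustrative helper, not part of the package):

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the published digest, e.g.:
# sha256_of_file("autogluon_assistant-1.0.0.tar.gz") == "d22bc743...ff166c4c"
```

Reading in chunks keeps memory use constant regardless of file size.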
Provenance
The following attestation bundles were made for autogluon_assistant-1.0.0.tar.gz:
Publisher: pypi_release.yml on autogluon/autogluon-assistant
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: autogluon_assistant-1.0.0.tar.gz
- Subject digest: d22bc743409bd3b814d49d7a3c9e8fdbfb6f2278cbf8bf850db37846ff166c4c
- Sigstore transparency entry: 272121429
- Sigstore integration time:
- Permalink: autogluon/autogluon-assistant@24dd713a443b654eb031ed76b36808c5b5dfaf10
- Branch / Tag: refs/tags/v1.0.0
- Owner: https://github.com/autogluon
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi_release.yml@24dd713a443b654eb031ed76b36808c5b5dfaf10
- Trigger Event: release
File details
Details for the file autogluon_assistant-1.0.0-py3-none-any.whl.
File metadata
- Download URL: autogluon_assistant-1.0.0-py3-none-any.whl
- Upload date:
- Size: 2.2 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 42b91acd5872d81b8a5071a21a41e9018408e46076f6b0be2db9d88e9a5e491a |
| MD5 | 9e120ca94eaf817b25f3b311c4eea7d9 |
| BLAKE2b-256 | 846df55044c256392bc9fa98176020afd7565d2e6bb2bd01bdef7d7a3a8e043e |
Provenance
The following attestation bundles were made for autogluon_assistant-1.0.0-py3-none-any.whl:
Publisher: pypi_release.yml on autogluon/autogluon-assistant
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: autogluon_assistant-1.0.0-py3-none-any.whl
- Subject digest: 42b91acd5872d81b8a5071a21a41e9018408e46076f6b0be2db9d88e9a5e491a
- Sigstore transparency entry: 272121455
- Sigstore integration time:
- Permalink: autogluon/autogluon-assistant@24dd713a443b654eb031ed76b36808c5b5dfaf10
- Branch / Tag: refs/tags/v1.0.0
- Owner: https://github.com/autogluon
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi_release.yml@24dd713a443b654eb031ed76b36808c5b5dfaf10
- Trigger Event: release