
open deep research based on autoagent

Project description


Auto-Deep-Research: Your Fully-Automated Personal Assistant


Welcome to Auto-Deep-Research! Auto-Deep-Research is an open-source, cost-efficient alternative to OpenAI's Deep Research, built on the AutoAgent (formerly MetaChain) framework.

✨Key Features

  • 🏆 High Performance: Ranks #1 among open-source methods, delivering performance comparable to OpenAI's Deep Research.
  • 🌐 Universal LLM Support: Seamlessly integrates with a wide range of LLMs (e.g., OpenAI, Anthropic, DeepSeek, vLLM, Grok, Hugging Face, ...).
  • 🔀 Flexible Interaction: Supports both function-calling and non-function-calling LLMs.
  • 💰 Cost-Efficient: Open-source alternative to Deep Research's $200/month subscription; bring your own pay-as-you-go LLM API keys.
  • 📁 File Support: Handles file uploads for enhanced data interaction.
  • 🚀 One-Click Launch: Get started instantly with a single auto deep-research command. Zero configuration needed, a truly out-of-the-box experience.

🚀 Own your personal assistant at much lower cost. Try 🔥Auto-Deep-Research🔥 now!

🔥 News

  • [2025, Feb 16]: 🎉🎉 We've cleaned up the codebase of [AutoAgent](https://github.com/HKUDS/AutoAgent), removed the parts irrelevant to Auto-Deep-Research, and released the first version of Auto-Deep-Research.


🧐 Why release Auto-Deep-Research?

After releasing AutoAgent (previously known as MetaChain) for a week, we've observed three compelling reasons to introduce Auto-Deep-Research:

  1. Community Interest
    We noticed significant community interest in our Deep Research alternative functionality. In response, we've streamlined the codebase by removing non-Deep-Research related components to create a more focused tool.

  2. Framework Extensibility
    Auto-Deep-Research serves as the first ready-to-use product built on AutoAgent, demonstrating how quickly and easily you can create powerful Agent Apps using our framework.

  3. Community-Driven Improvements
    We've incorporated valuable community feedback from the first week, introducing features like one-click launch and enhanced LLM compatibility to make the tool more accessible and versatile.

Auto-Deep-Research represents our commitment to both the community's needs and the demonstration of AutoAgent's potential as a foundation for building practical AI applications.

⚡ Quick Start

Installation

Auto-Deep-Research Installation

conda create -n auto_deep_research python=3.10
conda activate auto_deep_research
git clone https://github.com/HKUDS/Auto-Deep-Research.git
cd Auto-Deep-Research
pip install -e .
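
After the editable install finishes, you can sanity-check that the package and its CLI entry point are on your PATH. The --help flag is an assumption (typical for CLI tools); the README itself only documents the auto deep-research subcommand:

```shell
# optional sanity check after installation (run inside the activated conda env)
pip show auto_deep_research   # prints package metadata if the install succeeded
auto --help                   # the CLI entry point; --help is assumed to exist
```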

Docker Installation

We use Docker to containerize the agent-interactive environment, so please install Docker first. You don't need to pull the pre-built image manually: Auto-Deep-Research automatically pulls the image that matches your machine's architecture.
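
Before launching, it can save a failed first run to confirm that Docker is installed and the daemon is reachable. A minimal check, independent of Auto-Deep-Research itself:

```shell
# check that the docker CLI exists and the daemon answers
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  echo "docker ready"
else
  echo "docker missing or daemon not running"
fi
```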

API Keys Setup

Create an environment variable file modeled on .env.template and set the API keys for the LLMs you want to use. Not every key is required; set only the ones you need.
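
For example, a minimal .env might look like the snippet below (the key values are placeholders, not real keys). The tool reads this file itself, but exporting the variables with set -a is handy when testing commands by hand:

```shell
# write a minimal .env in the repo root; replace the placeholders with real keys
cat > .env <<'EOF'
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENAI_API_KEY=your_openai_api_key
EOF

# optionally export every variable defined in .env into the current shell
set -a; . ./.env; set +a
echo "$OPENAI_API_KEY"
```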

Start Auto-Deep-Research

Command Options:

Run auto deep-research to start Auto-Deep-Research. The command accepts the following flags and environment variables:

  • --container_name: Name of the Docker container (default: 'deepresearch')
  • --port: Port for the container (default: 12346)
  • COMPLETION_MODEL: The LLM model to use; follow LiteLLM's model naming convention (default: claude-3-5-sonnet-20241022)
  • DEBUG: Enable debug mode for detailed logs (default: False)
  • API_BASE_URL: The base URL for the LLM provider (default: None)
  • FN_CALL: Enable function calling (default: None). Most of the time you can ignore this option, because the default is chosen based on the model name.
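
Flags and environment variables can be combined freely. For instance, a launch that swaps the model, turns on debug logging, and moves the container to a different name and port might look like this (the model name, container name, and port are illustrative; running it requires Docker and the matching API key in .env):

```shell
# illustrative combined launch: non-default model, debug logs, custom container
COMPLETION_MODEL=gpt-4o DEBUG=True auto deep-research \
  --container_name my_research --port 12400
```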

Different LLM Providers

We will show you how easy it is to start Auto-Deep-Research with different LLM providers.

Anthropic
  • set the ANTHROPIC_API_KEY in the .env file.
ANTHROPIC_API_KEY=your_anthropic_api_key
  • run the following command to start Auto-Deep-Research.
auto deep-research # default model is claude-3-5-sonnet-20241022
OpenAI
  • set the OPENAI_API_KEY in the .env file.
OPENAI_API_KEY=your_openai_api_key
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=gpt-4o auto deep-research
Mistral
  • set the MISTRAL_API_KEY in the .env file.
MISTRAL_API_KEY=your_mistral_api_key
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=mistral/mistral-large-2407 auto deep-research
Gemini - Google AI Studio
  • set the GEMINI_API_KEY in the .env file.
GEMINI_API_KEY=your_gemini_api_key
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=gemini/gemini-2.0-flash auto deep-research
Huggingface
  • set the HUGGINGFACE_API_KEY in the .env file.
HUGGINGFACE_API_KEY=your_huggingface_api_key
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=huggingface/meta-llama/Llama-3.3-70B-Instruct auto deep-research
Groq
  • set the GROQ_API_KEY in the .env file.
GROQ_API_KEY=your_groq_api_key
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=groq/deepseek-r1-distill-llama-70b auto deep-research
OpenAI-Compatible Endpoints (e.g., Grok)
  • set the OPENAI_API_KEY in the .env file.
OPENAI_API_KEY=your_api_key_for_openai_compatible_endpoints
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=openai/grok-2-latest API_BASE_URL=https://api.x.ai/v1 auto deep-research
OpenRouter (e.g., DeepSeek-R1)

We currently recommend OpenRouter as the LLM provider for DeepSeek-R1, because the official DeepSeek-R1 API cannot yet be used efficiently.

  • set the OPENROUTER_API_KEY in the .env file.
OPENROUTER_API_KEY=your_openrouter_api_key
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=openrouter/deepseek/deepseek-r1 auto deep-research
DeepSeek
  • set the DEEPSEEK_API_KEY in the .env file.
DEEPSEEK_API_KEY=your_deepseek_api_key
  • run the following command to start Auto-Deep-Research.
COMPLETION_MODEL=deepseek/deepseek-chat auto deep-research

Tips

Import browser cookies to browser environment

You can import your browser cookies into the browser environment so the agent can better access certain websites. For more details, please refer to the cookies folder.

More features coming soon! 🚀 Web GUI interface under development.

☑️ Todo List

Auto-Deep-Research is continuously evolving! Here's what's coming:

  • 🖥️ GUI Agent: Supporting Computer-Use agents with GUI interaction
  • 🏗️ Code Sandboxes: Supporting additional environments like E2B
  • 🎨 Web Interface: Developing comprehensive GUI for better user experience

Have ideas or suggestions? Feel free to open an issue! Stay tuned for more exciting updates! 🚀

📖 Documentation

More detailed documentation is coming soon 🚀; we will keep the Documentation page updated.

🤝 Join the Community

If you find Auto-Deep-Research helpful, join our community on Slack or Discord.

🙏 Acknowledgements

Rome wasn't built in a day. Auto-Deep-Research is built on the AutoAgent framework. We extend our sincere gratitude to all the pioneering works that have shaped AutoAgent, including OpenAI Swarm for framework architecture inspiration, Magentic-one for the three-agent design insights, OpenHands for documentation structure, and many other excellent projects that contributed to agent-environment interaction design. Your innovations have been instrumental in making both AutoAgent and Auto-Deep-Research possible.

🌟 Cite


Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

auto_deep_research-0.1.0.tar.gz (35.9 kB)

Uploaded Source

Built Distribution


auto_deep_research-0.1.0-py3-none-any.whl (39.5 kB)

Uploaded Python 3

File details

Details for the file auto_deep_research-0.1.0.tar.gz.

File metadata

  • Download URL: auto_deep_research-0.1.0.tar.gz
  • Upload date:
  • Size: 35.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for auto_deep_research-0.1.0.tar.gz
Algorithm Hash digest
SHA256 45ece2ed8fb3762613c3a35c622d37dbacf7244064bd293265b1f55f677c3997
MD5 0eda23097e4ea6789f7a2cf0a87ae5c5
BLAKE2b-256 602b3442e84e1abc8dbb8ab84d4b45d162c4529af18406a4a3673d67a53f2cf4

See more details on using hashes here.

File details

Details for the file auto_deep_research-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for auto_deep_research-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 623f3954682fc5bb0fa34572fe02753bd02c68e24cba8880104a2fff18098190
MD5 a7a5b2847ccc749d3dd00379b136cd71
BLAKE2b-256 08969fc7f361741f747aa9e52fc3dfd953b4c37f3f3606b5d59863a595cc8988

