AI Agentic Hub
A web GUI to visually build AI agents and workflows using local or cloud LLMs and MCP tool servers. No code required — create agents, connect tools, and build complex agentic workflows from a visual editor.
Features
- Multi-Provider LLM Support — connect to Ollama (local), OpenAI, or Anthropic from the same GUI
- MCP Tool Server Management — add MCP servers, auto-discover tools via MCP SDK
- Agent Builder — create/edit/delete agents with custom system prompts, select any LLM and MCP tools
- Agent Chat — chat with agents, see tool calls and results in real time
- Visual Workflow Editor — drag-and-drop DAG editor (Drawflow) to build agentic workflows
- Orchestrator Loop Pattern — an orchestrator agent dynamically routes tasks to specialist agents and loops until done
- Conditional Edges — route workflow paths based on state (e.g. needs_math == true)
- Shared Workflow State — typed state (LangGraph StateGraph) passes between nodes
- ReAct Agents — agents use LangChain/LangGraph ReAct pattern (reason, act, observe, repeat)
- Mix Providers — use different LLMs for different agents in the same workflow (e.g. orchestrator on Claude, workers on local Ollama)
Prerequisites
- Python 3.12+
- uv (recommended) or pip
- Ollama installed and running with a model (e.g. ollama pull qwen3.5:9b)
- (Optional) OpenAI or Anthropic API key for cloud LLMs
- (Optional) An MCP tool server to connect
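Before starting the hub, it can help to confirm that a local Ollama server is actually reachable. The small stdlib-only helper below queries Ollama's /api/tags endpoint (11434 is Ollama's default port); the helper itself is an illustration, not part of this project:

```python
# Sanity-check helper (not shipped with the hub): list locally pulled
# Ollama models, or return [] if the server is unreachable.
import json
import urllib.error
import urllib.request


def ollama_models(base_url="http://localhost:11434"):
    """Return the names of locally pulled models, or [] if Ollama is down."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []
```

If `ollama_models()` returns an empty list, either Ollama is not running or no model has been pulled yet.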
Quick Start
1. Clone and install
git clone https://github.com/hasanjawad001/ai-agentic-hub.git
cd ai-agentic-hub
uv venv --python 3.12
source .venv/bin/activate
uv pip install -e .
2. Start the example MCP tool server (optional)
In a separate terminal:
source .venv/bin/activate
start-mcp
This starts 8 example tools on port 3000: add, subtract, multiply, divide, reverse_string, uppercase, lowercase, int_to_string.
3. Start the hub
source .venv/bin/activate
start-hub
4. Set up in the GUI
- LLM Servers — Add your LLM server:
  - Ollama: provider=ollama, url=http://localhost:11434, model=qwen3.5:9b
  - OpenAI: provider=openai, model=gpt-4o, api_key=sk-...
  - Anthropic: provider=anthropic, model=claude-sonnet-4-20250514, api_key=sk-ant-...
- MCP Servers — Add your MCP server (url: http://localhost:3000) and click Discover Tools
- Agents — Create agents with system prompts and selected tools
- Workflows — Create a workflow, open the editor, drag nodes, connect them, and run
Example: Orchestrator Workflow
Build a workflow where an orchestrator agent dynamically routes tasks to specialists:
Start -> Orchestrator <-> Math Agent (loop back)
                      <-> Text Agent (loop back)
         Orchestrator -> End (when done)
Input: compute ((5+5)/(4-2))*3, convert to string, uppercase it, then reverse it
Result: The orchestrator loops 3 times — sends math to the math agent (add, subtract, divide, multiply = 15), then sends text processing to the text agent (uppercase, reverse = "0.51"), then signals done.
All from the visual editor. No code written.
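The routing the orchestrator performs can be sketched in plain Python. This is a framework-free illustration of the loop pattern on the example input above, not the hub's actual implementation (which builds a LangGraph StateGraph); the state keys and function names are invented for the sketch:

```python
# Framework-free sketch of the orchestrator loop: a router inspects shared
# state and dispatches to specialist agents until the work is done.

def orchestrator(state):
    # Conditional edges: pick the next node based on shared state.
    if not state["math_done"]:
        return "math"
    if not state["text_done"]:
        return "text"
    return "end"

def math_agent(state):
    # Specialist: ((5 + 5) / (4 - 2)) * 3
    state["value"] = ((5 + 5) / (4 - 2)) * 3
    state["math_done"] = True
    return state

def text_agent(state):
    # Specialist: stringify, uppercase, then reverse.
    state["value"] = str(state["value"]).upper()[::-1]
    state["text_done"] = True
    return state

def run_workflow():
    state = {"value": None, "math_done": False, "text_done": False}
    workers = {"math": math_agent, "text": text_agent}
    while True:  # the orchestrator loop: route, run, repeat until "end"
        route = orchestrator(state)
        if route == "end":
            return state["value"]
        state = workers[route](state)

print(run_workflow())  # "0.51" — division yields 15.0, reversed as a string
```

Note the "0.51": Python's division produces the float 15.0, so the string "15.0" reversed is "0.51", matching the result described above.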
Project Structure
ai-agentic-hub/
├── backend/
│ ├── main.py # FastAPI app + page routes
│ ├── database.py # SQLite setup
│ ├── models.py # LLMServer, MCPServer, Agent, Workflow
│ ├── api/
│ │ ├── llm_routes.py # LLM server CRUD + health check
│ │ ├── mcp_routes.py # MCP server CRUD + tool discovery
│ │ ├── agent_routes.py # Agent CRUD + chat
│ │ └── workflow_routes.py # Workflow CRUD + run
│ └── services/
│ ├── llm_service.py # LangChain LLM client (Ollama/OpenAI/Anthropic)
│ ├── mcp_service.py # MCP SDK client + LangChain tool conversion
│ ├── agent_service.py # LangGraph ReAct agent
│ └── workflow_service.py # LangGraph StateGraph workflow engine
├── frontend/templates/ # Jinja2 HTML templates (dark theme)
├── examples/
│ └── test_mcp_server.py # Example MCP server with 8 tools
├── pyproject.toml
└── LICENSE
Tech Stack
- Backend: Python, FastAPI, SQLModel, SQLite
- Frontend: Jinja2 templates, vanilla JavaScript
- Agent Engine: LangChain + LangGraph (ReAct agents, StateGraph workflows)
- LLM Providers: langchain-ollama, langchain-openai, langchain-anthropic
- Workflow Editor: Drawflow
- MCP Client: mcp Python SDK
- MCP Server (example): FastMCP
License
MIT
File details
Details for the file ai_agentic_hub-0.2.1.tar.gz.
File metadata
- Download URL: ai_agentic_hub-0.2.1.tar.gz
- Upload date:
- Size: 22.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 5a12b357e573bfd506d1f074cf68789fb36e4e5cf4bd70a2e28634144d78395c |
| MD5 | a46cc2808c95f1cb0415c1c2188d5257 |
| BLAKE2b-256 | 97a7f136a3fb79367e2c25ea9d67aee8ca6dd7897932b6e2bb1b13d60becfd76 |
File details
Details for the file ai_agentic_hub-0.2.1-py3-none-any.whl.
File metadata
- Download URL: ai_agentic_hub-0.2.1-py3-none-any.whl
- Upload date:
- Size: 34.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 5306578022dca6ffde799cd0f6dcf7577053fc6ad61e4677b677315a397f699c |
| MD5 | 15193c5cd77ef1f7dc450ac5fe8f59a8 |
| BLAKE2b-256 | ec0e69b8ebd9840ab4a00e557711f326a9dbfbc01f80e497b68d327535bad8ab |