
AI Agentic Hub

A web GUI to visually build AI agents and workflows using local or cloud LLMs and MCP tool servers. No code required — create agents, connect tools, and build complex agentic workflows from a visual editor.

Features

  • Multi-Provider LLM Support — connect to Ollama (local), OpenAI, or Anthropic from the same GUI
  • MCP Tool Server Management — add MCP servers, auto-discover tools via MCP SDK
  • Agent Builder — create/edit/delete agents with custom system prompts, select any LLM and MCP tools
  • Agent Chat — chat with agents, see tool calls and results in real time
  • Visual Workflow Editor — drag-and-drop DAG editor (Drawflow) to build agentic workflows
  • Orchestrator Loop Pattern — an orchestrator agent dynamically routes tasks to specialist agents and loops until done
  • Conditional Edges — route workflow paths based on state (e.g. needs_math == true)
  • Shared Workflow State — typed state (LangGraph StateGraph) passes between nodes
  • ReAct Agents — agents use LangChain/LangGraph ReAct pattern (reason, act, observe, repeat)
  • Mix Providers — use different LLMs for different agents in the same workflow (e.g. orchestrator on Claude, workers on local Ollama)
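The conditional-edge and shared-state ideas above can be sketched without any dependencies. This is an illustrative mini state machine, not the project's actual LangGraph code; the node names and the `needs_math` key are assumptions for the sketch:

```python
# Dependency-free sketch of conditional routing over shared workflow state.
# The real project uses LangGraph's StateGraph; names here are illustrative.

def math_node(state):
    state["answer"] = state["a"] + state["b"]  # stand-in for a math agent
    return state

def text_node(state):
    state["answer"] = state["text"].upper()    # stand-in for a text agent
    return state

def route(state):
    # Conditional edge: choose the next node based on the current state.
    return math_node if state.get("needs_math") else text_node

def run(state):
    node = route(state)
    return node(state)

print(run({"needs_math": True, "a": 2, "b": 3})["answer"])  # 5
```

In the GUI the same decision is drawn as a conditional edge (e.g. `needs_math == true`) rather than written as code.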

Prerequisites

  • Python 3.12+
  • uv (recommended) or pip
  • Ollama installed and running with a model (e.g. ollama pull qwen3.5:9b)
  • (Optional) OpenAI or Anthropic API key for cloud LLMs
  • (Optional) An MCP tool server to connect

Quick Start

1. Clone and install

git clone https://github.com/hasanjawad001/ai-agentic-hub.git
cd ai-agentic-hub
uv venv --python 3.12
source .venv/bin/activate
uv pip install -e .

2. Start the example MCP tool server (optional)

In a separate terminal:

source .venv/bin/activate
start-mcp

This starts an example MCP server on port 3000 exposing eight tools: add, subtract, multiply, divide, reverse_string, uppercase, lowercase, int_to_string.
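The eight tools are simple operations; as plain Python they might look like the following (a sketch of the behavior, not the server's actual implementation):

```python
# Plain-Python sketches of the eight example tools served over MCP.
def add(a: float, b: float) -> float: return a + b
def subtract(a: float, b: float) -> float: return a - b
def multiply(a: float, b: float) -> float: return a * b
def divide(a: float, b: float) -> float: return a / b
def reverse_string(s: str) -> str: return s[::-1]
def uppercase(s: str) -> str: return s.upper()
def lowercase(s: str) -> str: return s.lower()
def int_to_string(n: int) -> str: return str(n)

# The example expression from the orchestrator workflow below:
print(divide(add(5, 5), subtract(4, 2)) * 3)  # 15.0
```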

3. Start the hub

source .venv/bin/activate
start-hub

Open http://localhost:8000

4. Set up in the GUI

  1. LLM Servers — Add your LLM server:
    • Ollama: provider=ollama, url=http://localhost:11434, model=qwen3.5:9b
    • OpenAI: provider=openai, model=gpt-4o, api_key=sk-...
    • Anthropic: provider=anthropic, model=claude-sonnet-4-20250514, api_key=sk-ant-...
  2. MCP Servers — Add your MCP server (url: http://localhost:3000) and click Discover Tools
  3. Agents — Create agents with system prompts and selected tools
  4. Workflows — Create a workflow, open the editor, drag nodes, connect them, and run
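
The provider fields from step 1 could be represented as, e.g. (field names mirror the GUI form; the values are the documented examples):

```python
# Illustrative LLM server configurations matching the three GUI examples.
llm_servers = [
    {"provider": "ollama", "url": "http://localhost:11434", "model": "qwen3.5:9b"},
    {"provider": "openai", "model": "gpt-4o", "api_key": "sk-..."},
    {"provider": "anthropic", "model": "claude-sonnet-4-20250514", "api_key": "sk-ant-..."},
]

# Only the local Ollama entry needs a URL; the cloud providers need an API key.
for server in llm_servers:
    assert "model" in server
```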

Example: Orchestrator Workflow

Build a workflow where an orchestrator agent dynamically routes tasks to specialists:

Start -> Orchestrator <-> Math Agent (loop back)
                      <-> Text Agent (loop back)
         Orchestrator -> End (when done)

Input: compute ((5+5)/(4-2))*3, convert to string, uppercase it, then reverse it

Result: The orchestrator loops 3 times — it first sends the arithmetic to the math agent (add, subtract, divide, multiply → 15.0), then the text processing to the text agent (int_to_string, uppercase, reverse → "0.51"), then signals done.

All from the visual editor. No code written.
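The orchestrator loop pattern above can be sketched as a plain-Python loop. The specialist functions and the routing rule are illustrative stand-ins for the LLM-driven agents:

```python
# Illustrative orchestrator loop: route each remaining task to a specialist
# agent, loop back with updated state, stop when nothing is left to do.

def math_agent(state):
    state["value"] = ((5 + 5) / (4 - 2)) * 3   # add, subtract, divide, multiply
    return state

def text_agent(state):
    s = str(state["value"])                     # convert to string
    state["value"] = s.upper()[::-1]            # uppercase, then reverse
    return state

def orchestrator(state):
    # Decide the next specialist from the remaining tasks, or signal done.
    return state["tasks"].pop(0) if state["tasks"] else None

state = {"tasks": [math_agent, text_agent], "value": None}
while (agent := orchestrator(state)) is not None:   # loops until done
    state = agent(state)

print(state["value"])  # 0.51
```

In the real workflow the orchestrator is itself an agent deciding the route from the conversation, but the control flow — specialist, loop back, specialist, loop back, done — is the same.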

Project Structure

ai-agentic-hub/
├── backend/
│   ├── main.py                 # FastAPI app + page routes
│   ├── database.py             # SQLite setup
│   ├── models.py               # LLMServer, MCPServer, Agent, Workflow
│   ├── api/
│   │   ├── llm_routes.py       # LLM server CRUD + health check
│   │   ├── mcp_routes.py       # MCP server CRUD + tool discovery
│   │   ├── agent_routes.py     # Agent CRUD + chat
│   │   └── workflow_routes.py  # Workflow CRUD + run
│   └── services/
│       ├── llm_service.py      # LangChain LLM client (Ollama/OpenAI/Anthropic)
│       ├── mcp_service.py      # MCP SDK client + LangChain tool conversion
│       ├── agent_service.py    # LangGraph ReAct agent
│       └── workflow_service.py # LangGraph StateGraph workflow engine
├── frontend/templates/         # Jinja2 HTML templates (dark theme)
├── examples/
│   └── test_mcp_server.py      # Example MCP server with 8 tools
├── pyproject.toml
└── LICENSE

Tech Stack

  • FastAPI + Jinja2 templates (backend and dark-theme frontend)
  • SQLite for persistence
  • LangChain / LangGraph (ReAct agents, StateGraph workflow engine)
  • MCP SDK for tool discovery and calls
  • Drawflow for the visual DAG editor
  • LLM providers: Ollama (local), OpenAI, Anthropic

License

MIT
