
Full-stack AI agent with Python backend and Vue frontend

Project description

IriBot - Lightweight AI Agent Chat System

A full-featured AI agent application with tool-calling capabilities and a real-time conversation experience, built as a full-stack application with a Python FastAPI backend and a Vue 3 frontend.

🚀 Quick Start

  • Install: pip install iridet-bot
  • Create an empty directory for the agent and run iribot from inside it
  • To specify a host and port: iribot --host 0.0.0.0 --port 8080
  • (Optional, recommended) Copy some skills into the skills directory

✨ Key Features

🤖 AI Agent Conversation

  • Intelligent conversation powered by OpenAI API
  • Streaming response support for real-time AI replies
  • Image input support (vision capabilities)
  • Customizable system prompts

🛠️ Tool Calling System

The agent can autonomously call the following tools to complete tasks:

  • File Operations

    • read_file - Read file contents
    • write_file - Create or modify files
    • list_directory - List directory contents
  • Command Execution

    • shell_start - Start an interactive shell session
    • shell_run - Execute commands in shell
    • shell_read - Read shell output
    • shell_write - Write input to shell
    • shell_stop - Stop shell session
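
Tools like these are typically declared to the model using the OpenAI function-calling schema. A hedged sketch of what a read_file declaration might look like — the field names follow the standard OpenAI tools format, but the exact schema IriBot emits may differ:

```python
import json

# Hypothetical tool declaration in the OpenAI function-calling format.
# Illustrative only; IriBot's generated schema may differ in detail.
read_file_tool = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read file contents",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {
                    "type": "string",
                    "description": "Path of the file to read",
                },
            },
            "required": ["path"],
        },
    },
}

print(json.dumps(read_file_tool, indent=2))
```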

💬 Session Management

  • Multi-session support: create multiple independent conversations
  • Persistent session history storage
  • Session list management (create, switch, delete)
  • Independent system prompts for each session

🎨 Modern UI

  • Beautiful interface based on TDesign component library
  • Real-time tool call status display
  • Markdown message rendering support
  • Responsive design for different screen sizes

🏗️ System Architecture

graph TB
    subgraph Frontend["Frontend Layer"]
        A[ChatSidebar<br/>Session List]
        B[ChatContainer<br/>Chat View]
        C[ToolCallMessage<br/>Tool Call View]
        FE[Vue 3 + TDesign UI + Vite]
    end

    subgraph Backend["Backend Layer"]
        D[main.py<br/>FastAPI Server]
        E[agent.py<br/>AI Agent]
        F[executor.py<br/>Tool Executor]
        G[session_manager.py<br/>Session State Management]
        H[tools/<br/>Tool Suite]
        BE[FastAPI + OpenAI SDK]
    end

    subgraph External["External Services"]
        I[OpenAI API / Compatible LLM Service<br/>GPT-4, GPT-3.5, or Custom Models]
    end

    Frontend -->|HTTP/SSE<br/>Server-Sent Events| Backend
    Backend -->|OpenAI API| External

    D --> E
    E --> F
    E --> G
    F --> H

    style Frontend fill:#e1f5ff
    style Backend fill:#fff4e1
    style External fill:#f0f0f0

Data Flow

sequenceDiagram
    participant User
    participant Frontend
    participant SessionManager
    participant Agent
    participant ToolExecutor
    participant Tools
    participant OpenAI

    User->>Frontend: Input Message
    Frontend->>SessionManager: POST /api/chat/stream
    SessionManager->>SessionManager: Save user message
    SessionManager->>Agent: Forward message
    Agent->>OpenAI: Call OpenAI API

    alt Text Response
        OpenAI-->>Agent: Stream text content
        Agent-->>Frontend: SSE stream
        Frontend-->>User: Display in real-time
    end

    alt Tool Call Request
        OpenAI-->>Agent: Return tool call request
        Agent->>ToolExecutor: Execute tool
        ToolExecutor->>Tools: Call specific tool

        alt File Operations
            Tools->>Tools: Read/Write file system
        end

        alt Shell Commands
            Tools->>Tools: Execute shell commands
        end

        Tools-->>ToolExecutor: Return result
        ToolExecutor-->>Agent: Tool execution result
        Agent->>OpenAI: Send tool result
        OpenAI-->>Agent: Continue generating response
        Agent-->>Frontend: SSE stream
        Frontend-->>User: Display response
    end
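
The SSE legs of this flow can be illustrated with a minimal parser. This is a generic sketch of Server-Sent Events framing (`data:` lines terminated by a blank line), not IriBot's actual client code:

```python
def parse_sse(raw: str) -> list:
    """Split a raw SSE stream into the data payloads of its events.

    Each event is one or more `data:` lines; a blank line ends the event.
    """
    events, buffer = [], []
    for line in raw.splitlines():
        if line.startswith("data:"):
            buffer.append(line[len("data:"):].strip())
        elif line == "" and buffer:
            events.append("\n".join(buffer))
            buffer = []
    if buffer:  # flush a trailing event that lacks a final blank line
        events.append("\n".join(buffer))
    return events

stream = "data: Hello\n\ndata: world\n\n"
print(parse_sse(stream))  # ['Hello', 'world']
```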

🚀 Quick Start (from Source)

Requirements

  • Python 3.8+
  • Node.js 16+
  • OpenAI API Key (or compatible LLM service)

Installation

1. Clone the Repository

git clone <repository-url>
cd mybot

2. Backend Setup

cd iribot

# Create virtual environment (recommended)
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Configure environment variables
cp .env.example .env
# Edit .env file and add your OpenAI API Key

.env configuration example:

OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
OPENAI_MODEL=gpt-4-turbo-preview
# OPENAI_BASE_URL=https://api.openai.com/v1  # Optional, use custom API endpoint
DEBUG=false
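
These variables can be read in Python roughly as follows — a sketch using os.environ with the documented defaults, not the project's actual settings code:

```python
import os

def load_config() -> dict:
    """Read IriBot-style settings from the environment, applying the
    documented defaults. A sketch, not the project's actual loader."""
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return {
        "api_key": api_key,
        "model": os.environ.get("OPENAI_MODEL", "gpt-4-vision-preview"),
        "base_url": os.environ.get("OPENAI_BASE_URL") or None,  # None = official endpoint
        "debug": os.environ.get("DEBUG", "false").lower() == "true",
    }

os.environ.setdefault("OPENAI_API_KEY", "sk-example")  # demo only
print(load_config()["model"])
```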

3. Frontend Setup

cd frontend

# Install dependencies
npm install

4. Start Services

Using Automated Scripts (Recommended)

Windows:

# In project root directory
./setup.bat

Linux/macOS:

# In project root directory
chmod +x setup.sh
./setup.sh

Manual Start

Backend:

cd iribot
uvicorn main:app --reload --port 8000

Frontend:

cd frontend
npm run dev

🔧 Configuration

Backend Configuration

Configure in iribot/.env file:

Config Item       Description               Default
OPENAI_API_KEY    OpenAI API key            (required)
OPENAI_MODEL      Model to use              gpt-4-vision-preview
OPENAI_BASE_URL   Custom API endpoint       empty (official endpoint)
DEBUG             Debug mode                false
BASH_PATH         Bash executable path      bash

Frontend Configuration

Frontend connects to backend via Vite proxy. Configuration file: frontend/vite.config.js

export default {
  server: {
    proxy: {
      "/api": {
        target: "http://localhost:8000",
        changeOrigin: true,
      },
    },
  },
};

🔌 API Endpoints

Session Management

  • POST /api/sessions - Create new session
  • GET /api/sessions - Get session list
  • GET /api/sessions/{session_id} - Get session details
  • DELETE /api/sessions/{session_id} - Delete session

Chat Interface

  • POST /api/chat/stream - Send message (SSE streaming response)

Tool Status

  • GET /api/tools/status - Get all tool statuses
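
A minimal client-side sketch for these endpoints, covering only URL construction and the request payload. The endpoint paths come from the list above; the field names in the chat payload are assumptions for illustration, not IriBot's documented schema:

```python
import json
from typing import Optional

BASE = "http://localhost:8000"  # default backend address from the setup steps

def session_endpoint(session_id: Optional[str] = None) -> str:
    """Build a URL for the session-management endpoints listed above."""
    url = f"{BASE}/api/sessions"
    return f"{url}/{session_id}" if session_id else url

def chat_payload(session_id: str, message: str) -> str:
    """JSON body for POST /api/chat/stream. Field names here are
    assumptions, not IriBot's documented schema."""
    return json.dumps({"session_id": session_id, "message": message})

print(session_endpoint())       # http://localhost:8000/api/sessions
print(session_endpoint("abc"))  # http://localhost:8000/api/sessions/abc
```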

🛠️ Extension Development

Adding New Tools

  1. Create a new tool file in the iribot/tools/ directory
  2. Inherit from the BaseTool class:
from tools.base import BaseTool

class MyCustomTool(BaseTool):
    @property
    def name(self) -> str:
        return "my_custom_tool"

    @property
    def description(self) -> str:
        return "Tool description"

    @property
    def parameters(self) -> dict:
        return {
            "type": "object",
            "properties": {
                "param1": {
                    "type": "string",
                    "description": "Parameter description"
                }
            },
            "required": ["param1"]
        }

    def execute(self, **kwargs) -> dict:
        # Implement tool logic
        return {
            "success": True,
            "result": "Execution result"
        }
  3. Register the tool in executor.py:
def _register_default_tools(self):
    # ... other tools
    self.register_tool(MyCustomTool())
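
Once registered, the executor can dispatch to a tool by name. A self-contained sketch of that dispatch, with a minimal stand-in for BaseTool (the real class lives in tools.base and may differ):

```python
# Minimal stand-in for tools.base.BaseTool, for illustration only.
class BaseTool:
    @property
    def name(self) -> str: ...
    def execute(self, **kwargs) -> dict: ...

class EchoTool(BaseTool):
    """Hypothetical tool that echoes its input back."""
    @property
    def name(self) -> str:
        return "echo"

    def execute(self, **kwargs) -> dict:
        return {"success": True, "result": kwargs.get("text", "")}

# A registry keyed by tool name, as _register_default_tools implies.
registry = {}
tool = EchoTool()
registry[tool.name] = tool

print(registry["echo"].execute(text="hi"))  # {'success': True, 'result': 'hi'}
```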

Adding New Frontend Components

Add tool call visualization components in the frontend/src/components/tool-calls/ directory.

📝 Tech Stack

Backend

  • FastAPI - Modern, fast web framework
  • OpenAI SDK - LLM interface calling
  • Pydantic - Data validation and settings management
  • Uvicorn - ASGI server

Frontend

  • Vue 3 - Progressive JavaScript framework
  • TDesign - Enterprise-level UI component library
  • Vite - Next-generation frontend build tool
  • Marked - Markdown parser

🤝 Contributing

Issues and Pull Requests are welcome!

📄 License

MIT License


Note: Using this project requires a valid OpenAI API key or a compatible LLM service endpoint.


Download files

Download the file for your platform.

Source Distribution

iridet_bot-0.1.1a4.tar.gz (3.5 MB)

Uploaded Source

Built Distribution


iridet_bot-0.1.1a4-py3-none-any.whl (3.5 MB)

Uploaded Python 3

File details

Details for the file iridet_bot-0.1.1a4.tar.gz.

File metadata

  • Download URL: iridet_bot-0.1.1a4.tar.gz
  • Upload date:
  • Size: 3.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for iridet_bot-0.1.1a4.tar.gz
Algorithm Hash digest
SHA256 65f9e87ff81f541ecfad2d446763e70df28a99d57618cd64195e6598c0aa86e1
MD5 4523591b53d7c00644bc6861a84accde
BLAKE2b-256 12d64db614a71c78ca17bd6cc2c174a2ac4286839ef1f8a719c000943efb18c7


Provenance

The following attestation bundles were made for iridet_bot-0.1.1a4.tar.gz:

Publisher: pypi.yml on fyc09/iribot

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file iridet_bot-0.1.1a4-py3-none-any.whl.

File metadata

  • Download URL: iridet_bot-0.1.1a4-py3-none-any.whl
  • Upload date:
  • Size: 3.5 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for iridet_bot-0.1.1a4-py3-none-any.whl
Algorithm Hash digest
SHA256 9b4c651cbc861d5229a5b372103cf82dbe8340f967f61625750586582442df17
MD5 02dbd48811a29b5a65885eb81c7f5762
BLAKE2b-256 c475574a6e5c2afbdb071f4b934b67fc2a1336d44c8052f9fd6efd1284d5942b


Provenance

The following attestation bundles were made for iridet_bot-0.1.1a4-py3-none-any.whl:

Publisher: pypi.yml on fyc09/iribot

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
