
A modern Python library for building Google Chat bots with serverless-safe async processing.

Project description

Google Chat Bot Library (gchatbot)

A modern Python library for building Google Chat bots, leveraging FastAPI for high performance and native asynchronous support.

Overview

This library provides a robust base class (GChatBot) that handles the complexities of the Google Chat API, allowing you to focus on your bot's logic.

  • Serverless-Safe Architecture: Designed from the ground up to work reliably in serverless environments like Google Cloud Run and AWS Lambda using FastAPI's BackgroundTasks.
  • Hybrid Sync/Async Support: Automatically detects whether your processing methods are sync or async and handles them correctly.
  • Progressive Responses: Provide immediate feedback to the user for long-running tasks and then update the message with the final result.
  • Modular Architecture: Decoupled components like EventParser, AsyncProcessor, and ResponseFactory allow for advanced customization.
  • Simplified Event Parsing: Automatically converts the various Google Chat event payloads into a clean, predictable ExtractedEventData structure.
  • Fully Typed: Complete type hints for a superior developer experience and robust code.
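
The hybrid sync/async detection mentioned above can be illustrated with a minimal sketch (illustrative names, not the library's internal code): coroutine functions are awaited directly, while plain functions are pushed to a worker thread so they never block the event loop.

```python
import asyncio
import inspect
from typing import Any, Callable

async def dispatch(handler: Callable[..., Any], *args: Any) -> Any:
    """Hybrid dispatch sketch: await async handlers directly,
    run sync handlers in a thread via asyncio.to_thread."""
    if inspect.iscoroutinefunction(handler):
        return await handler(*args)
    return await asyncio.to_thread(handler, *args)

def syncHandler(text: str) -> str:
    return f"sync: {text}"

async def asyncHandler(text: str) -> str:
    return f"async: {text}"

async def main() -> None:
    print(await dispatch(syncHandler, "hello"))   # sync: hello
    print(await dispatch(asyncHandler, "hello"))  # async: hello

asyncio.run(main())
```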

How It Works: The Serverless-Safe Model

The library implements a processing model that is both efficient and safe for serverless environments, preventing premature termination of background jobs.

  1. Request Received: The FastAPI endpoint receives an event from the Google Chat API.
  2. Delegate to Handler: The request is passed to the GChatBot.handleRequest method, along with FastAPI's backgroundTasks object.
  3. Parse and Route: The event payload is parsed into a clean ExtractedEventData object. The library checks if the target handler (_processMessage or _processSlashCommand) is async or sync.
  4. Execution:
    • async handlers are awaited directly.
    • sync handlers are run in a separate thread (asyncio.to_thread) to avoid blocking the event loop.
  5. Response Handling:
    • Simple Response: If the handler returns a string, it is sent back immediately as a synchronous JSON response.
    • Progressive Response: If the handler returns a tuple (quick_message, detailed_coroutine):
      a. The quick_message is sent back immediately as a synchronous JSON response.
      b. The detailed_coroutine is added to FastAPI's backgroundTasks via background_tasks.add_task. FastAPI guarantees that this task runs even after the response has been sent.

This model ensures that long-running background jobs are not terminated prematurely by the serverless environment's lifecycle.
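
The response-handling branch (step 5) can be sketched as follows. `buildResponse` and `schedule` are illustrative names introduced here; in the real flow, `schedule` corresponds to `background_tasks.add_task`.

```python
import asyncio
from typing import Awaitable, Callable, Dict, Tuple, Union

# A handler returns either a plain string or (quick_message, detailed_coroutine).
ResponseType = Union[str, Tuple[str, Awaitable[str]]]

def buildResponse(result: ResponseType,
                  schedule: Callable[[Awaitable[str]], None]) -> Dict[str, str]:
    """Sketch of step 5: return the immediate JSON body and, for progressive
    responses, hand the detailed coroutine to the background scheduler."""
    if isinstance(result, tuple):
        quickMessage, detailedCoroutine = result
        schedule(detailedCoroutine)  # BackgroundTasks runs this after the reply
        return {"text": quickMessage}
    return {"text": result}
```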

Flow Diagram

graph TD
    subgraph "Google Chat"
        A[Webhook Request]
    end

    subgraph "FastAPI Application"
        B(Route Endpoint)
        C{handleRequest}
    end

    subgraph "GChatBot - Phase 1: Analysis & Routing"
        D[Parser: extractEventData]
        E{Is handler async or sync?}
        F[Run sync handler in thread]
        G[Run async handler directly]
        H(Result: ResponseType)
    end
    
    subgraph "GChatBot - Phase 2: Response Delivery"
        I{Is result a<br/>Progressive Response?}
        J[Factory: format simple response]
        K[Return HTTP 200 OK with JSON]
        L[Factory: format quick response]
        M[Return HTTP 200 OK with JSON]
        N[<b>background_tasks.add_task</b>]
    end

    subgraph "Background Process (Guaranteed by FastAPI)"
    O(Processor: runAndUpdate)
        P[Create asyncio.Task from coroutine]
        Q(Processor: handleAsyncResponse)
        R[API Call: Create 'Processing' msg]
        S[Await the Task's completion]
        T[API Call: Update msg with final result]
    end

    A --> B --> C
    C --> D
    D --> E
    E -- Sync --> F
    E -- Async --> G
    F --> H
    G --> H
    H --> I

    I -- No --> J
    J --> K

    I -- Yes --> L
    L --> M
    I -- Yes --> N

    N -.-> O
    O --> P
    P --> Q
    Q --> R
    R --> S
    S --> T
    
    style K fill:#d4edda,stroke:#155724
    style M fill:#d4edda,stroke:#155724
    style N fill:#ffeeba,stroke:#856404
    style T fill:#d4edda,stroke:#155724

Installation

# Install the library with FastAPI dependencies (Recommended)
pip install "gchatbot[fastapi]"

Recommended Usage: FastAPI Example

# example.py
import os
import asyncio
from typing import Any, Dict
from fastapi import FastAPI, Request, BackgroundTasks
from gchatbot import GChatBot, ExtractedEventData, EventPayload, ResponseType

# Ensure you have a 'service.json' file or set the environment variable.
SERVICE_ACCOUNT_FILE: str = os.environ.get("SERVICE_ACCOUNT_FILE", "service.json")

class BotExample(GChatBot):
    """
    Example bot demonstrating synchronous and asynchronous methods.
    """
    def __init__(self) -> None:
        super().__init__(
            botName="Example Bot",
            serviceAccountFile=SERVICE_ACCOUNT_FILE
        )

    async def _processSlashCommand(self, command: str, arguments: str, extractedData: ExtractedEventData, eventData: EventPayload) -> ResponseType:
        """
        Asynchronous method to process slash commands.
        """
        user: str = extractedData.get('userDisplayName', 'User')

        if command == "report":
            # Progressive response for a long-running async task
            quickResponse = f"📊 Understood, {user}. Generating your report, this may take a moment..."
            
            async def detailedResponse() -> str:
                await asyncio.sleep(8)  # Simulate long data processing
                return f"✅ Report for {user} is complete! You can view it here: [link]"
            
            return (quickResponse, detailedResponse())
        
        else:
            await asyncio.sleep(1)
            return f"✅ ASYNC command `/{command}` executed for {user}."

    def _processMessage(self, text: str, extractedData: ExtractedEventData, eventData: EventPayload) -> ResponseType:
        """
        Synchronous method to process regular messages.
        """
        user: str = extractedData.get('userDisplayName', 'User')
        
        return f"💬 SYNC message processed for {user}: '{text}'"

# --- FastAPI App Setup ---
app = FastAPI(title="Google Chat Bot Example")
bot = BotExample()

@app.post("/webhook")
async def handleEvent(request: Request, backgroundTasks: BackgroundTasks) -> Any:
    """Entry point for all Google Chat events."""
    return await bot.handleRequest(request, backgroundTasks)

@app.get("/")
def home() -> Dict[str, str]:
    """Health check endpoint."""
    return {"status": "active", "bot_name": bot.botName}

# To run locally: uvicorn example:app --reload --port 8080
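
To exercise the webhook locally, you can POST a minimal MESSAGE event to it. The payload below is only a sketch based on the public Google Chat event format; verify field names against the current Chat API documentation before relying on them.

```python
import json

# Minimal MESSAGE event payload (illustrative; consult the Google Chat API
# docs for the authoritative schema of real webhook events).
event = {
    "type": "MESSAGE",
    "message": {
        "text": "hello bot",
        "sender": {"displayName": "Test User"},
    },
    "space": {"name": "spaces/AAAA1234"},
    "user": {"displayName": "Test User"},
}

payload = json.dumps(event)
print(payload)
# Send it to the running app, e.g.:
#   curl -X POST localhost:8080/webhook -H "Content-Type: application/json" -d "$PAYLOAD"
```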

Changelog

Version 0.3.0 - Serverless Architecture Refactor

This version introduces a major architectural refactor to ensure robust performance in serverless environments and establishes new code quality standards.

💥 Breaking Changes

  • Code Standard: camelCase: All method parameters and TypedDict keys have been standardized to camelCase (e.g., event_data is now eventData, background_tasks is backgroundTasks). You must update your method signatures when upgrading.
  • Code Standard: English Language: All docstrings, comments, log messages, and default user-facing strings have been translated to English.

✨ New Features & Enhancements

  • Serverless-Safe Architecture: The core handleRequest logic has been rebuilt to use FastAPI's BackgroundTasks. This guarantees that long-running processes (Progressive Responses) are not terminated prematurely in serverless environments like Google Cloud Run or AWS Lambda.
  • Timeout Logic Removed: The old syncTimeout logic has been removed in favor of the more robust BackgroundTasks model. The bot now responds immediately with a quick message and schedules the long task, which is a more reliable pattern.
  • Codebase Standardization: The entire library codebase now follows stricter standards for docstrings, typing, and naming conventions.

(Older changelog entries below)

License

This project is licensed under the MIT License. See the LICENSE file for details.

Project details


Download files

Download the file for your platform.

Source Distribution

gchatbot-0.3.0.tar.gz (45.5 kB)

Uploaded Source

Built Distribution


gchatbot-0.3.0-py3-none-any.whl (31.5 kB)

Uploaded Python 3

File details

Details for the file gchatbot-0.3.0.tar.gz.

File metadata

  • Download URL: gchatbot-0.3.0.tar.gz
  • Upload date:
  • Size: 45.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.11

File hashes

Hashes for gchatbot-0.3.0.tar.gz

  • SHA256: 365996d07a2149a99fdd91b9c2687898fc9d96f8a58474fe39a4021e5f0e9284
  • MD5: 08809c686a98462019dcf636c3d0109f
  • BLAKE2b-256: 312f97e57b0a653ab9f22817a6a8bc30079d03010e9de554eddace233717ae7a


File details

Details for the file gchatbot-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: gchatbot-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 31.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.11

File hashes

Hashes for gchatbot-0.3.0-py3-none-any.whl

  • SHA256: dab8fe55f1857111193544b4bc692ac9cc13825d394d2df156270e8ec731e7cc
  • MD5: 762856167cfe0d0a95ab3751f1002591
  • BLAKE2b-256: a619f6cfa6e565193299cb11b4cfc9804401e6a1ecb17f8b55319a8a0132b69e

