
Escobar

Escobar was a bad guy.

AI Chat Extension for JupyterLab with comprehensive chat management and WebSocket proxy functionality. It is designed to talk to the Bonnie server, located locally at ../bonnie and on GitHub at https://github.com/voitta-ai/bonnie.

The request and response objects are defined in src/types/protocol.ts.

The corresponding objects for Bonnie are defined in ../bonnie/lib/protocol_messages.py (relative to this project).

Overall ecosystem

For more ecosystem details, see ../README.md when running locally, or https://github.com/voitta-ai/voitta-ws/blob/master/README.md on GitHub.

Installation

For development installation:

pip uninstall escobar -y && pip install -e . && jupyter labextension develop . --overwrite && jupyter server extension enable escobar

Features

Configuration:

Set the target WebSocket server in your .env file:

WEBSOCKET_PROXY_TARGET=ws://your-target-server/ws

Usage:

  • Connect to: ws://localhost:8888/ws (your JupyterLab server)
  • Traffic is automatically proxied to your configured target server
  • Supports both ws:// and wss:// protocols
  • Forwards authentication headers (Authorization, Cookie)
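Since the proxy accepts both ws:// and wss:// targets, the WEBSOCKET_PROXY_TARGET value can be sanity-checked before use. The following is a minimal sketch of such a check; the function name is hypothetical and not part of the extension.

```typescript
// Hypothetical helper: validate a WEBSOCKET_PROXY_TARGET value.
// The proxy supports both ws:// and wss:// protocols, as listed above.
function isValidProxyTarget(target: string): boolean {
  try {
    const url = new URL(target);
    return url.protocol === "ws:" || url.protocol === "wss:";
  } catch {
    return false; // not a parseable URL at all
  }
}
```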

Protocol and Message Exchange

Escobar communicates with the Bonnie backend server using a well-defined protocol based on JSON messages exchanged over WebSocket. The protocol message interfaces are defined in src/types/protocol.ts and correspond to Python classes in Bonnie's lib/protocol_messages.py.

Message Structure

Each message is either a request or a response, identified by the message_type field. Messages include a unique call_id for correlating requests and responses. Requests include context identifiers such as username and chatID to maintain session state.
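The envelope described above can be sketched as follows. The field names (message_type, call_id, username, chatID) come from this description, but the exact interfaces live in src/types/protocol.ts; the makeRequest helper is hypothetical.

```typescript
// Sketch of the message envelope; real shapes are in src/types/protocol.ts.
interface IRequest {
  message_type: string;   // e.g. "listChats", "continue", "userMessage"
  call_id: string;        // unique id correlating request and response
  username: string;       // session context
  chatID?: string;        // present once a chat session exists
}

interface IResponse {
  message_type: string;
  call_id: string;        // echoes the request's call_id
}

let nextCallId = 0;

// Hypothetical helper: build a request with a fresh call_id.
function makeRequest(
  message_type: string,
  username: string,
  chatID?: string
): IRequest {
  return { message_type, call_id: String(nextCallId++), username, chatID };
}
```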

Key message classes include:

  • IListChatsRequest / IListChatsResponse: List available chats for a user.
  • ILoadMessagesRequest / ILoadMessagesResponse: Load messages from a chat.
  • ICreateNewChatRequest / ICreateNewChatResponse: Create a new chat session.
  • IContinueRequest / IContinueResponse: Continue an existing chat session.
  • IUserMessageRequest: Send a user message to the backend.
  • IUserStopRequest: Request to stop the current LLM generation.
  • ISaveSettingsRequest / ISaveSettingsResponse: Save user settings.
  • IRetrieveSettingsRequest / IRetrieveSettingsResponse: Retrieve user settings.
  • IStreamingResponse: Streaming response for LLM content.
  • IToolCallRequest / IToolResponse: Tool call requests and responses for backend tool integrations.
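Because responses arrive asynchronously, the frontend has to match each response to its request by call_id. The sketch below shows one way to do that correlation; it is an illustration, not the extension's actual implementation, and the send callback is a stand-in for the real WebSocket plumbing.

```typescript
// Sketch of request/response correlation over a single WebSocket,
// assuming every response echoes its request's call_id.
type Pending = { resolve: (msg: any) => void };

class CallCorrelator {
  private pending = new Map<string, Pending>();
  private nextId = 0;

  constructor(private send: (msg: any) => void) {}

  // Send a request and return a promise for its matching response.
  request(message_type: string, fields: object): Promise<any> {
    const call_id = String(this.nextId++);
    const promise = new Promise<any>(resolve => {
      this.pending.set(call_id, { resolve });
    });
    this.send({ message_type, call_id, ...fields });
    return promise;
  }

  // Feed every incoming message here; resolves the matching request.
  onMessage(msg: { call_id: string }): void {
    const entry = this.pending.get(msg.call_id);
    if (entry) {
      this.pending.delete(msg.call_id);
      entry.resolve(msg);
    }
  }
}
```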

Message Flow

The frontend sends requests to Bonnie, which processes them and sends back responses asynchronously. Streaming responses allow partial LLM outputs to be sent incrementally. Tool call messages enable the backend to request external tool executions.
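Incremental streaming can be pictured as joining a sequence of IStreamingResponse chunks into the full LLM output. In this sketch the field carrying the text (content) and the end-of-stream flag (done) are assumptions; check src/types/protocol.ts for the real shape.

```typescript
// Sketch of accumulating incremental streaming chunks into the full
// LLM output. Field names `content` and `done` are assumptions.
interface IStreamingChunk {
  call_id: string;
  content: string; // partial LLM output (assumed field name)
  done?: boolean;  // assumed end-of-stream marker
}

function accumulateStream(chunks: IStreamingChunk[]): string {
  return chunks.map(c => c.content).join("");
}
```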

There are two main chat session flows:

  1. Starting a New Chat:

    • The user sends an ICreateNewChatRequest to create a new chat session.
    • Bonnie responds with ICreateNewChatResponse containing the new chatID.
    • The frontend may optionally send an IContinueRequest to initialize or synchronize session state, but this is not strictly required immediately after creating a new chat.
  2. Continuing an Existing Chat:

    • The user selects an existing chat from the list.
    • The frontend sends an IContinueRequest with the existing chatID.
    • Bonnie responds with IContinueResponse to confirm session continuation.
    • The frontend then sends ILoadMessagesRequest to load chat history.
    • Bonnie responds with ILoadMessagesResponse containing past messages.
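The two flows above can be sketched as follows, using a stand-in call function in place of the real WebSocket round trip. Message names come from the flow description; the payload shapes (chatID, messages) are assumptions based on it.

```typescript
// Stand-in for a correlated request/response round trip to Bonnie.
type Call = (message_type: string, fields: object) => Promise<any>;

// Flow 1: start a new chat; the follow-up IContinueRequest is optional.
async function startNewChat(call: Call, username: string): Promise<string> {
  const created = await call("createNewChat", { username });
  return created.chatID; // Bonnie's response carries the new chatID
}

// Flow 2: continue an existing chat, then load its history.
async function continueChat(call: Call, username: string, chatID: string) {
  await call("continue", { username, chatID });         // confirm session
  const loaded = await call("loadMessages", { username, chatID });
  return loaded.messages;                               // past messages
}
```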

Introspection and Voitta Router

Escobar integrates with the Voitta Tool Router, implemented in src/voitta/voittaServer.ts. The router manages the AI prompt and available tools, exposing an introspection interface.

  • Introspection: The router provides a method intraspect() that returns the current prompt and tools in OpenAPI schema format. This allows clients to dynamically discover available tools and their interfaces.
  • Voitta Router: Converts internal tool definitions into OpenAPI specifications compatible with FastAPI, enabling structured API interaction and dynamic tool invocation.
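The introspection interface described above might look roughly like this. The return shape is an assumption; the real router lives in src/voitta/voittaServer.ts, and the "runCell" tool used here is purely illustrative.

```typescript
// Sketch of the introspection interface: a router exposing intraspect()
// that returns the current prompt plus tools in an OpenAPI-style shape.
interface ToolSpec {
  name: string;
  description: string;
  parameters: object; // OpenAPI-style JSON schema (assumed)
}

interface Introspection {
  prompt: string;
  tools: ToolSpec[];
}

class MockVoittaRouter {
  constructor(private prompt: string, private tools: ToolSpec[]) {}

  // Note: the method name is spelled `intraspect` in the source.
  intraspect(): Introspection {
    return { prompt: this.prompt, tools: this.tools };
  }
}
```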

Sequence Diagram

The following sequence diagram illustrates the message exchange between the JupyterLab extension (Escobar), the Voitta router, and the Bonnie backend server. It covers the introspection and tool call flows, and separates new chat creation from existing chat continuation.

sequenceDiagram
    participant User as User (JupyterLab)
    participant Escobar as Escobar Extension
    participant Voitta as Voitta Tool Router
    participant Bonnie as Bonnie Backend Server

    %% Existing chat selection flow
    User->>Escobar: IListChatsRequest (listChats)
    Escobar->>Bonnie: IListChatsRequest (listChats)
    Bonnie->>Escobar: IListChatsResponse (listChats)
    Escobar->>User: IListChatsResponse (listChats)

    User->>Escobar: IContinueRequest (continue) [select existing chat]
    Escobar->>Bonnie: IContinueRequest (continue)
    Bonnie->>Escobar: IContinueResponse (continue)
    Escobar->>User: IContinueResponse (continue)

    Escobar->>Bonnie: ILoadMessagesRequest (loadMessages)
    Bonnie->>Escobar: ILoadMessagesResponse (loadMessages)
    Escobar->>User: ILoadMessagesResponse (loadMessages)

    %% New chat creation flow
    User->>Escobar: ICreateNewChatRequest (createNewChat)
    Escobar->>Bonnie: ICreateNewChatRequest (createNewChat)
    Bonnie->>Escobar: ICreateNewChatResponse (createNewChat)
    Escobar->>User: ICreateNewChatResponse (createNewChat)

    %% Optional continue after new chat (not always required)
    Escobar->>Bonnie: IContinueRequest (continue) [optional after new chat]
    Bonnie->>Escobar: IContinueResponse (continue)
    Escobar->>User: IContinueResponse (continue)

    %% Sending user messages
    User->>Escobar: IUserMessageRequest (userMessage)
    Escobar->>Bonnie: IUserMessageRequest (userMessage)
    Bonnie->>Voitta: Processes message, may invoke tools
    Voitta->>Voitta: intraspect() returns prompt and tools (OpenAPI)
    Bonnie->>Escobar: IStreamingResponse or IToolCallRequest
    Escobar->>User: Displays response content

    Bonnie->>Voitta: IToolCallRequest (tool_call)
    Voitta->>Bonnie: IToolResponse (tool_response)
    Bonnie->>Escobar: IToolResponse (tool_response)
    Escobar->>User: Updates UI with tool results

Note on IContinueRequest after new chat

The IContinueRequest sent after a new chat creation is optional and may be used to initialize or synchronize session state. It is not strictly required immediately after creating a new chat, but can help ensure the frontend and backend are fully synchronized before further interactions.

This diagram centralizes the protocol and introspection explanation for the entire system. Bonnie's README.md will reference this diagram for protocol details.
