Backend.AI AppProxy Worker
Purpose
The App Proxy Worker is a high-performance reverse proxy that routes user traffic to compute session services (Jupyter, SSH, TensorBoard, etc.) running on agents. It receives routing information from the Coordinator and handles SSL/TLS termination, load balancing, and traffic forwarding.
Key Responsibilities
1. Traffic Proxying
- Proxy HTTP/HTTPS requests to session services
- Proxy WebSocket connections for interactive services
- Handle SSL/TLS termination
- Stream responses efficiently
2. Route Resolution
- Receive routing tables from Coordinator
- Resolve session services from URLs
- Cache routing information locally
- Update routes dynamically
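The local route cache described above can be sketched as a small in-memory table that the Coordinator replaces wholesale on each push. The `Route`/`RouteCache` names and the `(session_id, service)` key shape are illustrative assumptions, not the worker's actual data model:

```python
import time
from dataclasses import dataclass, field


@dataclass
class Route:
    """Backend address for one session service (hypothetical shape)."""
    agent_host: str
    agent_port: int


@dataclass
class RouteCache:
    """In-memory routing table, swapped atomically on Coordinator updates."""
    _routes: dict = field(default_factory=dict)
    _updated_at: float = 0.0

    def update(self, routes: dict) -> None:
        # The Coordinator sends a full table; replace it in one assignment.
        self._routes = dict(routes)
        self._updated_at = time.time()

    def resolve(self, session_id: str, service: str):
        return self._routes.get((session_id, service))


cache = RouteCache()
cache.update({("sess-1", "jupyter"): Route("10.0.0.5", 30100)})
```

Swapping the whole dict (rather than mutating it in place) keeps lookups consistent even while an update is being applied.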
3. Health Checking
- Monitor backend service health
- Detect failed services
- Report health status to Coordinator
- Handle service failover
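One simple way to implement the monitoring loop above is a periodic TCP connect probe per backend, reporting the aggregate status upstream. This is a minimal sketch under that assumption; the real worker's probe method, interval, and reporting callback may differ:

```python
import asyncio


async def check_backend(host: str, port: int, timeout: float = 3.0) -> bool:
    """Probe one backend with a TCP connect; healthy iff it accepts."""
    try:
        reader, writer = await asyncio.wait_for(
            asyncio.open_connection(host, port), timeout)
    except (OSError, asyncio.TimeoutError):
        return False
    writer.close()
    await writer.wait_closed()
    return True


async def health_loop(backends, report, interval: float = 10.0):
    """Periodically probe all backends and report status upstream."""
    while True:
        status = {}
        for host, port in backends:
            status[(host, port)] = await check_backend(host, port)
        await report(status)  # e.g. POST the result to the Coordinator
        await asyncio.sleep(interval)
```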
Architecture
1. Traffic Proxy (Main)
Framework: aiohttp + custom reverse proxy
Port: 5050 (default, HTTPS)
Protocol: HTTP/HTTPS, WebSocket
Key Features:
HTTP/HTTPS Proxy
- Route user requests to session services
- URL Pattern:
https://<worker-domain>/<session-id>/<service-name>/...
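Given the URL pattern above, extracting the routing keys is a path split. A minimal sketch (the function name and return shape are illustrative):

```python
from urllib.parse import urlsplit


def parse_proxy_path(url: str):
    """Split /<session-id>/<service-name>/<rest> out of a proxied URL."""
    path = urlsplit(url).path
    parts = path.lstrip("/").split("/", 2)
    if len(parts) < 2:
        raise ValueError(f"not a proxy path: {path!r}")
    session_id, service_name = parts[0], parts[1]
    rest = "/" + parts[2] if len(parts) == 3 else "/"
    return session_id, service_name, rest


print(parse_proxy_path("https://worker.example.com/sess-1234/jupyter/lab/tree"))
# ('sess-1234', 'jupyter', '/lab/tree')
```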
WebSocket Proxy
- Interactive service communication (Jupyter Kernel, SSH, etc.)
- Real-time log streaming
Key Characteristics:
- SSL/TLS termination (automatic certificate provisioning via Let's Encrypt)
- High-performance async proxy
- Connection pooling and reuse
- Streaming support (large file downloads)
- Sticky session support
- Auto-retry and failover
Processing Flow:
HTTP Proxy Flow
User → HTTPS Request → Worker (SSL termination)
↓
Parse URL (extract session_id, service_name)
↓
Lookup route from local cache
↓
Resolve backend address (agent:port)
↓
Proxy request to agent
↓
Stream response back to user
WebSocket Proxy Flow
User → WS Upgrade Request → Worker
↓
Establish WS connection to agent
↓
Bidirectional message forwarding
2. REST API (Management)
Framework: aiohttp (async HTTP server)
Port: 6040 (default, separate management port)
Key Features:
- Communication with Coordinator
- Health check endpoints
- Metrics exposure (Prometheus)
- Internal management (no external access)
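A management API with these features might look like the following sketch. The endpoint paths, metric names, and response shapes are illustrative assumptions, not the worker's documented API:

```python
from aiohttp import web

# Counters the proxy would update while serving traffic (illustrative).
METRICS = {"proxy_requests_total": 0, "proxy_active_connections": 0}


async def health(request) -> web.Response:
    return web.json_response({"status": "healthy"})


async def metrics(request) -> web.Response:
    # Minimal Prometheus text exposition format: "name value" per line.
    lines = [f"{name} {value}" for name, value in METRICS.items()]
    return web.Response(text="\n".join(lines) + "\n",
                        content_type="text/plain")


mgmt_app = web.Application()
mgmt_app.router.add_get("/health", health)
mgmt_app.router.add_get("/metrics", metrics)
# web.run_app(mgmt_app, port=6040)  # bind only on the management port
```

Running this on a separate aiohttp application keeps management traffic off the user-facing proxy port entirely.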
Component Interaction
Traffic Proxy Flow:
User (Browser) → Worker (Port 5050) → Kernel (on Agent)
│
├─ SSL/TLS termination
├─ Route resolution
└─ Traffic proxying
Management Flow:
Coordinator → Worker REST API (Port 6040) → Route updates
Download files
File details
Details for the file backend_ai_appproxy_worker-25.17.0rc2.tar.gz.
File metadata
- Download URL: backend_ai_appproxy_worker-25.17.0rc2.tar.gz
- Size: 41.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e1ec1df9c1d511228d6bcfcecca172e4cbfa00f175ec9581297275f42edf0880 |
| MD5 | cd70733c098f48f19c7cce536eff181e |
| BLAKE2b-256 | 1ccb9aa5cf12320475f15f4ee4f2ea637d79f737a2ea710446f2b50e671e52ec |
File details
Details for the file backend_ai_appproxy_worker-25.17.0rc2-py3-none-any.whl.
File metadata
- Download URL: backend_ai_appproxy_worker-25.17.0rc2-py3-none-any.whl
- Size: 53.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 37bf879aa4513afd751ac3ef9de2f1b4f1a10aecb86da8d106c5fb1504590667 |
| MD5 | 422abc651b5b2782dff804043eab7b9b |
| BLAKE2b-256 | c802bbfa54ce78b986f61683b6fb21245699a062b4888cf7df7653d5e2e7cb56 |