OpenAI-compatible LLM provider for exoclaw with streaming request bodies, per-model routing, and CPython + MicroPython support
exoclaw-provider-openai
Direct-httpx OpenAI-compatible provider for exoclaw.
Uses httpx directly (no OpenAI SDK, no LiteLLM) so the request body can be
streamed as the JSON is built rather than materialized as one contiguous
string before the POST. On long conversations this drops peak per-turn RAM
from ≈3× prompt-size (list + JSON dump + httpx buffer) to ≈1× streaming
buffer — see exoclaw/docs/memory-model.md Step B.
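The streaming-body idea can be sketched as a byte generator handed to httpx's `content=` parameter (which accepts an iterable of bytes). The names below are illustrative, not the package's actual internals:

```python
import json
from typing import Iterator

def iter_chat_body(model: str, messages: list[dict]) -> Iterator[bytes]:
    """Yield the chat-completions request body piece by piece, so the
    transport can stream it instead of holding one contiguous string."""
    yield b'{"model":' + json.dumps(model).encode() + b',"stream":true,"messages":['
    for i, msg in enumerate(messages):
        if i:
            yield b","
        # Each message is dumped on its own, so peak RAM is one message,
        # not the whole conversation serialized at once.
        yield json.dumps(msg, separators=(",", ":")).encode()
    yield b"]}"
```

httpx can consume this directly, e.g. `httpx.post(url, content=iter_chat_body(model, messages), headers=...)`, which is what lets the body stay chunked end to end.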
Shape
from exoclaw_provider_openai import Deployment, OpenAIStreamingProvider

provider = OpenAIStreamingProvider(
    default_model="zai/glm-4.7",
    deployments={
        "zai/glm-4.7": Deployment(base_url="https://openrouter.ai/api/v1", api_key=OPENROUTER_KEY),
        "minimax/minimax-m2.7": Deployment(base_url="https://openrouter.ai/api/v1", api_key=OPENROUTER_KEY),
        "openai/gpt-5.4": Deployment(base_url="https://api.openai.com/v1", api_key=OPENAI_KEY),
        "zai-direct/glm-5.1": Deployment(base_url="https://api.z.ai/api/coding/paas/v4", api_key=ZAI_KEY),
    },
    fallbacks={
        "zai/glm-4.7": ["minimax/minimax-m2.7"],
        "zai-direct/glm-5.1": ["minimax/minimax-m2.7"],
    },
)
Each model name maps to exactly one deployment (base URL + API key + optional
extra headers). fallbacks is a per-model list — when the primary raises a
retryable error (429, 5xx, timeout) the provider walks the fallback list
until one succeeds or all are exhausted.
Routing policy
- Each model → exactly one deployment. No load balancing across deployments for a single model.
- Fallbacks are strict per-model lists. No automatic cross-provider fallback.
- SSE response streaming is always on (the streaming-request-body path requires it as the codec; non-streaming responses aren't supported).
- TTFT timeout: if the first response byte doesn't arrive within
stream_ttft_timeout seconds, the request is abandoned and the fallback list is tried. Default: 15 s.
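The TTFT check can be sketched over any byte iterator. This is illustrative only (a real client enforces the deadline at the socket read; `clock` is injectable here just to make the sketch testable):

```python
import time

def guard_ttft(chunks, ttft_timeout: float = 15.0, clock=time.monotonic):
    """Re-yield response chunks, raising if the first one lands too late."""
    start = clock()
    first = True
    for chunk in chunks:
        if first:
            if clock() - start > ttft_timeout:
                raise TimeoutError("no first byte within the TTFT deadline")
            first = False
        yield chunk
```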
What's intentionally skipped vs. LiteLLM
- Multi-provider cost tracking / usage normalization (OpenRouter returns the OpenAI shape directly; exoclaw doesn't use cost tracking).
- Automatic Anthropic cache_control tagging. If your deployment needs it, stamp it on the messages before handing them to chat() — this provider is OpenAI-schema only.
- Deep parameter remapping across providers. Everything here is the OpenAI chat-completions schema; it's the caller's responsibility to send messages in that shape.
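For example, a hypothetical pre-processing helper (not part of this package) that tags the last message in the Anthropic content-block style before passing it to chat():

```python
def stamp_cache_control(messages: list[dict]) -> list[dict]:
    """Return a copy with the final message's text wrapped in a content
    block carrying an Anthropic-style cache_control tag."""
    out = [dict(m) for m in messages]  # shallow copies; originals untouched
    last = out[-1]
    if isinstance(last.get("content"), str):
        last["content"] = [{
            "type": "text",
            "text": last["content"],
            "cache_control": {"type": "ephemeral"},  # Anthropic's cache marker
        }]
    return out
```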
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file exoclaw_provider_openai-0.4.2.tar.gz.
File metadata
- Download URL: exoclaw_provider_openai-0.4.2.tar.gz
- Upload date:
- Size: 17.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.7 {"installer":{"name":"uv","version":"0.11.7","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 65dc93c14aacd7a93f12515ef2f912e2342aee2581affba42bcbbf9a639a99e5 |
| MD5 | e31a4d2e7ed7bd9a25964e4099d37d65 |
| BLAKE2b-256 | 36271b10fc42bef361f80c44d792fe6862e2aa997ee1d6a7febfc1180b8398cc |
File details
Details for the file exoclaw_provider_openai-0.4.2-py3-none-any.whl.
File metadata
- Download URL: exoclaw_provider_openai-0.4.2-py3-none-any.whl
- Upload date:
- Size: 12.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.7 {"installer":{"name":"uv","version":"0.11.7","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | fd866022ba87bde601d309d3d62c4b4b2551bea96b393bc8a66552e66a861f3f |
| MD5 | 82f67f2aa953fa24f42c2684c1eb2125 |
| BLAKE2b-256 | d0f06fb2f274a53efd04ac4e9a5477656e3fc50d6cb8e417fbb6c8527944d819 |