# alltoken-ai

Official Python SDK for AllToken — one API for OpenAI, Anthropic, and 100+ models.
```bash
pip install alltoken-ai
```

Requires Python 3.10+.
## Quick start
```python
from alltoken import AllToken

client = AllToken(api_key="...")  # or os.environ["ALLTOKEN_API_KEY"]

# OpenAI-compatible surface (maps to /v1)
resp = client.openai.raw.post(
    "/chat/completions",
    json={
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.json())

# Anthropic-compatible surface (maps to /anthropic)
resp = client.anthropic.raw.post(
    "/messages",
    json={
        "model": "claude-sonnet-4",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.json())
```
The same API key works for both surfaces. Model catalog: alltoken.ai/models.
## Configuration
```python
AllToken(
    api_key="...",                       # required
    base_url="https://api.alltoken.ai",  # optional, defaults to production
    default_headers={"X-My-Tag": "a"},   # optional, merged into every request
)
```
## API surface
| Field | Spec | Base URL |
|---|---|---|
| `client.openai.raw` | `chat.yml` (OpenAI-compatible) | `https://api.alltoken.ai/v1` |
| `client.anthropic.raw` | `anthropic.yml` | `https://api.alltoken.ai/anthropic` |
`.raw` is a pre-configured `httpx.Client` — the base URL and auth are already set, so you can call `.get()`, `.post()`, and `.stream()` directly. Pydantic models for request/response bodies are generated from the OpenAPI specs into `alltoken.generated.chat` and `alltoken.generated.anthropic`.
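The generated models let you validate a request body before it goes over the wire. The class names below are illustrative stand-ins, since the actual names inside `alltoken.generated.chat` depend on the codegen output:

```python
from pydantic import BaseModel

# Illustrative stand-ins for codegen'd models; the real generated names may differ.
class Message(BaseModel):
    role: str
    content: str

class ChatCompletionRequest(BaseModel):
    model: str
    messages: list[Message]

req = ChatCompletionRequest(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],  # dicts coerced to Message
)

# model_dump() produces the plain dict to pass as `json=` to `.raw.post(...)`
payload = req.model_dump()
print(payload)
```

Validation errors (a missing `model`, a non-string `content`) surface at construction time instead of as a 4xx from the gateway.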
## Status
v0.1.0 — scaffold. Pydantic models are generated from the specs, and the wrapper surface is minimal. Expect breaking changes throughout 0.x. Ergonomic helpers (`client.chat.completions.create(...)`, async streaming iterators, retries, etc.) are coming in 0.2.x.
## Contributing / Local development
```bash
# Clone megaopenrouter as a sibling (for the OpenAPI specs)
git clone git@gitlab.53site.com:ai-innovation-lab/megaopenrouter.git ../megaopenrouter

# Install with dev deps
pip install -e ".[dev]"

# Regenerate pydantic models from the specs
python scripts/generate.py

# Test + lint
pytest
ruff check
mypy .
```
Generated models live in `src/alltoken/generated/{chat,anthropic}.py` — these are committed so users who install from PyPI don't need to run codegen.
## License