# Nadir Python SDK

Official Python client for the Nadir LLM router — intelligent model routing that cuts API costs by 30-60%.
## Installation

```bash
pip install nadir-sdk
```
## Quick Start

```python
from nadir import NadirClient

client = NadirClient(api_key="ndr_...")

# Chat completion — Nadir picks the optimal model
response = client.chat.completions.create(
    messages=[{"role": "user", "content": "What is 2+2?"}],
)
print(response.choices[0].message.content)

# See which model was selected and why
print(response.nadir_metadata.tier)              # "simple"
print(response.nadir_metadata.selected_model)    # "gpt-4o-mini"
print(response.nadir_metadata.complexity_score)  # 0.12
```
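The tier and complexity metadata above is what drives the advertised savings: simple prompts go to cheap models, hard ones to strong models. As a back-of-the-envelope illustration (the prices below are assumed for the example, not Nadir's actual rates), routing a majority of traffic to a cheaper model lowers the blended per-token cost:

```python
# Illustrative blended-cost estimate; prices are assumptions, not Nadir's rates.
PRICE_PER_MTOK = {"gpt-4o": 2.50, "gpt-4o-mini": 0.15}  # USD per 1M input tokens

def blended_cost(simple_share, cheap="gpt-4o-mini", strong="gpt-4o"):
    """Average input cost when `simple_share` of traffic goes to the cheap model."""
    return simple_share * PRICE_PER_MTOK[cheap] + (1 - simple_share) * PRICE_PER_MTOK[strong]

baseline = blended_cost(0.0)  # everything on the strong model: 2.50
routed = blended_cost(0.6)    # 60% of prompts classified "simple": 1.09
print(f"{1 - routed / baseline:.0%}")  # 56% cheaper on input tokens
```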
## Streaming

```python
stream = client.chat.completions.create(
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```
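If you also need the full text after streaming, you can accumulate the deltas yourself. This helper is a generic sketch (not part of the SDK) that works with any OpenAI-style chunk stream:

```python
def collect_text(stream):
    """Join the non-empty delta contents of an OpenAI-style chunk stream."""
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # deltas may be None (e.g. the final chunk)
            parts.append(delta)
    return "".join(parts)
```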
## Model Recommendation (no LLM call)

```python
rec = client.recommend("Explain quantum entanglement in detail")
print(rec)  # {"recommended_model": "claude-sonnet-4-20250514", "complexity": ...}
```
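Because `recommend()` returns a plain dict, it is easy to layer your own policy on top, such as gating expensive models behind a complexity ceiling. The threshold and the assumption that `complexity` is a 0-1 score are illustrative here, not guaranteed by the API:

```python
def choose_model(rec, ceiling=0.3, cheap_model="gpt-4o-mini"):
    """Override the recommendation with a cheap model for low-complexity prompts.

    Assumes `rec["complexity"]` is a 0-1 score, as in the example output above.
    """
    if rec.get("complexity", 1.0) <= ceiling:
        return cheap_model
    return rec["recommended_model"]

print(choose_model({"recommended_model": "claude-sonnet-4-20250514", "complexity": 0.82}))
# claude-sonnet-4-20250514
print(choose_model({"recommended_model": "gpt-4o", "complexity": 0.12}))
# gpt-4o-mini
```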
## Async

```python
import asyncio

from nadir import AsyncNadirClient

async def main():
    async with AsyncNadirClient(api_key="ndr_...") as client:
        response = await client.chat.completions.create(
            messages=[{"role": "user", "content": "Hello!"}],
        )
        print(response.choices[0].message.content)

asyncio.run(main())
```
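The async client pairs naturally with `asyncio.gather` for fanning out independent prompts. The sketch below stubs the network call (the real call is shown in a comment) so the concurrency pattern stands on its own:

```python
import asyncio

async def ask(client, prompt):
    # With the real SDK this body would be:
    #   resp = await client.chat.completions.create(
    #       messages=[{"role": "user", "content": prompt}],
    #   )
    #   return resp.choices[0].message.content
    await asyncio.sleep(0)  # stand-in for the network round trip
    return f"answer to: {prompt}"

async def ask_many(client, prompts):
    """Run all prompts concurrently; results keep the input order."""
    return await asyncio.gather(*(ask(client, p) for p in prompts))

print(asyncio.run(ask_many(None, ["a", "b"])))  # ['answer to: a', 'answer to: b']
```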
## Advanced: Fallback & Routing Control

```python
response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Complex analysis..."}],
    route="fallback",                                         # enable auto-fallback
    fallback_models=["claude-sonnet-4-20250514", "gpt-4o"],   # explicit fallback chain
    layers={"routing": True, "optimize": True},               # per-request feature toggles
    reasoning={"effort": "high"},                             # reasoning token support
)
```
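The `fallback_models` chain above is applied by the router. If you ever need the same behavior client-side (for example against a different endpoint), the pattern is just a loop that returns the first model whose call succeeds. This sketch is independent of the SDK:

```python
def call_with_fallback(call, models):
    """Try `call(model)` for each model in order; return the first success."""
    last_error = None
    for model in models:
        try:
            return call(model)
        except Exception as exc:  # in practice, catch the SDK's specific error types
            last_error = exc
    raise RuntimeError("all fallback models failed") from last_error

# Demo with a stand-in for the API call: the primary model always times out.
def flaky(model):
    if model == "primary":
        raise TimeoutError("overloaded")
    return f"ok from {model}"

print(call_with_fallback(flaky, ["primary", "backup"]))  # ok from backup
```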
## Environment Variables

| Variable | Description | Default |
|---|---|---|
| `NADIR_API_KEY` | API key (fallback if not passed to the constructor) | — |
| `NADIR_BASE_URL` | API base URL | `https://api.getnadir.dev` |
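The precedence in the table (explicit constructor arguments win, environment variables are the fallback) can be expressed as a small resolver. This is a sketch of the usual pattern, not the SDK's internal code:

```python
import os

def resolve_config(api_key=None, base_url=None):
    """Explicit arguments take precedence; environment variables are the fallback."""
    return {
        "api_key": api_key or os.environ.get("NADIR_API_KEY"),
        "base_url": base_url or os.environ.get("NADIR_BASE_URL", "https://api.getnadir.dev"),
    }
```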
## OpenAI Drop-in Compatibility

Nadir's API is OpenAI-compatible, so you can also use the OpenAI SDK directly:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.getnadir.dev/v1",
    api_key="ndr_...",
)
response = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Hello"}],
)
```
The Nadir SDK adds typed access to routing metadata, recommendations, smart export, and clustering — features the OpenAI SDK does not expose.
## File details

### nadir_sdk-0.1.0.tar.gz (source distribution)

- Size: 11.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.9.6

| Algorithm | Hash digest |
|---|---|
| SHA256 | `16a65acae8f7c19594e4ceda8650ebd7870fe7b9e08c12ee7015280fb3212a93` |
| MD5 | `61a034f1735e9cc03aec9d16be7e242b` |
| BLAKE2b-256 | `d1dd46476aa2468c9010fede62ef162c8a745e09a5704e9978d5cce96b23b28e` |
### nadir_sdk-0.1.0-py3-none-any.whl (built distribution)

- Size: 11.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.9.6

| Algorithm | Hash digest |
|---|---|
| SHA256 | `aa6e310716766ef0d3f0235924fcbcef9e24cd88bccd444512ec7e627ddb1cf8` |
| MD5 | `8ff8a18c592442c1dfc31ff8b1a2e821` |
| BLAKE2b-256 | `5034e6bd9f75b65b5e5eee98daeb1c7f8369e9344d35af8ad605bfc147ba5783` |