nineth
nineth is the Python SDK for the 1984 model API, built by the 9th District at Tooig.
Install
pip install nineth
export NINETH_API_KEY="your-api-key"
How it works
Every request goes through client.model.request(...).
- Pass a task. Get a response.
- Set stream=True to receive text as it arrives, word by word.
- Everything else — caching, service routing, continuity — is handled server-side.
Models
| Name | Description |
|---|---|
| 1984-m3-0317 | Most capable. Best for research and complex tasks. |
| 1984-m2-preview | Fast and powerful. Good for most tasks. |
| 1984-m2-light | Lightweight, quick general tasks. |
| 1984-m1-unified | High-throughput unified model. |
| 1984-m0-brute | Compact, efficient model. |
| 1984-m0-sm | Smallest model, fastest responses. |
Set a default at client creation or pass model= per call.
Cookbook
1 — Get a response
The simplest case. Ask something, get the answer.
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request("Give me a tight BTC market brief.")
    print(response["final_response"])
response is a plain dict. The text is always in response["final_response"].
2 — Stream the response live
Set stream=True to print text as it arrives.
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    for event in client.model.request("Summarise crude oil today.", stream=True):
        if event["type"] == "model_delta":
            print(event["data"]["text"], end="", flush=True)
The last event in the stream is type: result and contains the full final_response
alongside iterations.
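If you want both the live text and the final payload, you can fold the stream yourself. A minimal sketch over the documented event shapes — the consume_stream helper name is ours, not part of the SDK:

```python
def consume_stream(events):
    """Print model_delta text as it arrives and return the final result payload."""
    final = None
    for event in events:
        if event["type"] == "model_delta":
            # Live text chunks: echo them immediately.
            print(event["data"]["text"], end="", flush=True)
        elif event["type"] == "result":
            # Always the last event; carries final_response and iterations.
            final = event["data"]
    print()
    return final
```

Pass it the stream directly: result = consume_stream(client.model.request(task, stream=True)).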
3 — Choose a different model per request
from nineth import NinethClient

with NinethClient() as client:
    response = client.model.request(
        "What happened with Nvidia earnings?",
        model="1984-m2-light",
    )
    print(response["final_response"])
4 — Control reasoning depth
Use reasoning to hint at how deeply the model should think before answering.
Valid values: "low", "medium", "high". Leave it out to use the model default.
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Analyse the macro impact of a Fed rate pause.",
        reasoning="high",
    )
    print(response["final_response"])
5 — Show the model's reasoning
Set show_reasoning=True to include the model's internal chain-of-thought.
This is off by default.
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Walk me through whether gold is trending or ranging.",
        reasoning="medium",
        show_reasoning=True,
    )
    for block in response.get("thinking", []):
        print("[thinking]", block)
    print(response["final_response"])
6 — Limit how many turns the model takes
max_iterations controls how many model turns the server runs.
The default is 10. Most tasks finish in 1–3 turns.
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Give me a one-paragraph ETH brief.",
        max_iterations=2,
    )
    print(response["final_response"])
7 — Async usage
import asyncio

from nineth import AsyncNinethClient

async def main():
    async with AsyncNinethClient(default_model="1984-m3-0317") as client:
        response = await client.model.request(
            "Summarise macro risk factors this week.",
        )
        print(response["final_response"])

asyncio.run(main())
Async streaming works the same way:
import asyncio

from nineth import AsyncNinethClient

async def main():
    async with AsyncNinethClient(default_model="1984-m3-0317") as client:
        async for event in await client.model.request(
            "Research BTC ETF flows.", stream=True
        ):
            if event["type"] == "model_delta":
                print(event["data"]["text"], end="", flush=True)

asyncio.run(main())
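The async client also makes fan-out straightforward: issue several buffered requests concurrently and gather the results. A sketch under the response shape described below — fan_out is our helper, and it accepts any object exposing the client.model.request(...) coroutine shown above:

```python
import asyncio

async def fan_out(client, tasks):
    """Run several buffered requests concurrently; return their final texts."""
    responses = await asyncio.gather(
        *(client.model.request(task) for task in tasks)
    )
    return [response["final_response"] for response in responses]
```

Inside async with AsyncNinethClient(...) as client:, call texts = await fan_out(client, ["Brief BTC.", "Brief ETH."]).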
8 — Health check
No API key needed. Use this to verify the endpoint is reachable.
from nineth import NinethClient

with NinethClient() as client:
    print(client.health())
    # {'status': 'ok', 'timestamp': '2026-04-04T00:00:00+00:00'}
9 — Point the SDK at a different endpoint
from nineth import NinethClient

with NinethClient(
    base_url="https://your-deployment.modal.run",
    api_key="your-key",
    default_model="1984-m3-0317",
) as client:
    response = client.model.request("Hello.")
    print(response["final_response"])
Or use environment variables:
export NINETH_BASE_URL="https://your-deployment.modal.run"
export NINETH_MODEL="1984-m3-0317"
Response shape
Buffered (stream=False)
{
    "final_response": "Bitcoin is trading near...",
    "iterations": 2,
    "usage": {"prompt_tokens": 412, "completion_tokens": 88, "total_tokens": 500},
    "thinking": [],  # only populated when show_reasoning=True
    "service_calls": [...],
    "service_responses": [...],
    "events": [...],
}
Only final_response and iterations are guaranteed to be present on every response.
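Because the optional keys can be absent, it pays to read them defensively. A minimal sketch — the summarize helper is ours, not part of the SDK:

```python
def summarize(response):
    """Collapse a buffered response dict into a few stable fields.

    Only final_response and iterations are guaranteed, so every other
    key is read with .get() and falls back to a safe default.
    """
    usage = response.get("usage", {})
    return {
        "text": response["final_response"],
        "iterations": response["iterations"],
        "total_tokens": usage.get("total_tokens", 0),
        "thinking_blocks": len(response.get("thinking", [])),
        "service_calls": len(response.get("service_calls", [])),
    }
```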
Streaming (stream=True)
Each loop iteration yields a dict:
# Text arriving live
{"type": "model_delta", "data": {"text": "Bitcoin is trading..."}}
# Tool calls the model made
{"type": "service_call", "data": {"service_name": "search_web", "params": {...}}}
{"type": "service_response", "data": {"service_name": "search_web", "success": True, "summary": {...}}}
# Final summary — always the last event
{"type": "result", "data": {"final_response": "...", "iterations": 2}}
Error handling
from nineth import NinethClient, NinethAPIError

with NinethClient(default_model="1984-m3-0317") as client:
    try:
        response = client.model.request("Analyse ETH.")
    except NinethAPIError as exc:
        print("API error:", exc)
    except ValueError as exc:
        print("Configuration error:", exc)
Authentication
Set NINETH_API_KEY in your environment or pass api_key= to the client constructor.
The health check endpoint does not require a key.
Project details
File details
Details for the file nineth-0.1.3.tar.gz.
File metadata
- Download URL: nineth-0.1.3.tar.gz
- Upload date:
- Size: 8.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1f590539e3387e302d966bcbab252d3f165e98882733bc54d4c68eb777dcd80a |
| MD5 | affeeff43179bf4b72ebe0ac79e0fc4d |
| BLAKE2b-256 | e00e886d11671d1a6628ac15af4ba50a82dccf496e12011425fffd4b9f0d213c |
File details
Details for the file nineth-0.1.3-py3-none-any.whl.
File metadata
- Download URL: nineth-0.1.3-py3-none-any.whl
- Upload date:
- Size: 7.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 50ed706119b233b407a7c05a41d8b8032d3726e8684ca964f3bba95059aa12ad |
| MD5 | 6df0b3b6ebefb6ebd3be781ec33ecf83 |
| BLAKE2b-256 | 36b7e94e05070a90f5a880bbfa52a98b4265738c96d8c6c525fb0714d3d70608 |