
The model SDK built by the 9th District at Tooig


nineth

nineth is the Python SDK for the 1984 model API, built by the 9th District at Tooig.


Install

pip install nineth
export NINETH_API_KEY="your-api-key"

How it works

Every request goes through client.model.request(...).

  • Pass a task. Get a response.
  • Set stream=True to receive text as it arrives, word by word.
  • Everything else — caching, service routing, continuity — is handled server-side.

Models

Name              Description
----              -----------
1984-m3-0317      Most capable. Best for research and complex tasks.
1984-m2-preview   Fast and powerful. Good for most tasks.
1984-m2-light     Lightweight; for quick, general tasks.
1984-m1-unified   High-throughput unified model.
1984-m0-brute     Compact, efficient model.
1984-m0-sm        Smallest model, fastest responses.

Set a default at client creation or pass model= per call.
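The precedence between the two can be sketched as a tiny helper (resolve_model is illustrative only, not part of the SDK):

```python
def resolve_model(per_call_model=None, default_model=None):
    """Pick the model a request would use: a per-call model= wins over
    the client's default_model. Illustrative sketch, not an SDK function."""
    if per_call_model is not None:
        return per_call_model
    if default_model is not None:
        return default_model
    raise ValueError("no model set on the client or the call")
```

So a call with model="1984-m2-light" uses that model even if the client was created with default_model="1984-m3-0317".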


Cookbook

1 — Get a response

The simplest case. Ask something, get the answer.

from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request("Give me a tight BTC market brief.")
    print(response["final_response"])

response is a plain dict. The text is always in response["final_response"].


2 — Stream the response live

Set stream=True to print text as it arrives.

from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    for event in client.model.request("Summarise crude oil today.", stream=True):
        if event["type"] == "model_delta":
            print(event["data"]["text"], end="", flush=True)

The last event in the stream is type: result and contains the full final_response alongside iterations.


3 — Choose a different model per request

from nineth import NinethClient

with NinethClient() as client:
    response = client.model.request(
        "What happened with Nvidia earnings?",
        model="1984-m2-light",
    )
    print(response["final_response"])

4 — Control reasoning depth

Use reasoning to hint at how deeply the model should think before answering. Valid values: "low", "medium", "high". Leave it out to use the model default.

from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Analyse the macro impact of a Fed rate pause.",
        reasoning="high",
    )
    print(response["final_response"])

5 — Show the model's reasoning

Set show_reasoning=True to include the model's internal chain-of-thought. This is off by default.

from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Walk me through whether gold is trending or ranging.",
        reasoning="medium",
        show_reasoning=True,
    )
    for block in response.get("thinking", []):
        print("[thinking]", block)
    print(response["final_response"])

6 — Limit how many turns the model takes

max_iterations controls how many model turns the server runs. The default is 10. Most tasks finish in 1–3 turns.

from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Give me a one-paragraph ETH brief.",
        max_iterations=2,
    )
    print(response["final_response"])

7 — Async usage

import asyncio
from nineth import AsyncNinethClient

async def main():
    async with AsyncNinethClient(default_model="1984-m3-0317") as client:
        response = await client.model.request(
            "Summarise macro risk factors this week.",
        )
        print(response["final_response"])

asyncio.run(main())

Async streaming works the same way:

import asyncio
from nineth import AsyncNinethClient

async def main():
    async with AsyncNinethClient(default_model="1984-m3-0317") as client:
        async for event in await client.model.request(
            "Research BTC ETF flows.", stream=True
        ):
            if event["type"] == "model_delta":
                print(event["data"]["text"], end="", flush=True)

asyncio.run(main())

8 — Health check

No API key needed. Use this to verify the endpoint is reachable.

from nineth import NinethClient

with NinethClient() as client:
    print(client.health())
# {'status': 'ok', 'timestamp': '2026-04-04T00:00:00+00:00'}

9 — Point the SDK at a different endpoint

from nineth import NinethClient

with NinethClient(
    base_url="https://your-deployment.modal.run",
    api_key="your-key",
    default_model="1984-m3-0317",
) as client:
    response = client.model.request("Hello.")
    print(response["final_response"])

Or use environment variables:

export NINETH_BASE_URL="https://your-deployment.modal.run"
export NINETH_MODEL="1984-m3-0317"

Response shape

Buffered (stream=False)

{
    "final_response": "Bitcoin is trading near...",
    "iterations": 2,
    "usage": {"prompt_tokens": 412, "completion_tokens": 88, "total_tokens": 500},
    "thinking": [],          # only populated when show_reasoning=True
    "service_calls": [...],
    "service_responses": [...],
    "events": [...],
}

Only final_response and iterations are guaranteed to be present on every response.
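Since every other key is optional, defensive access with dict.get avoids KeyErrors. A minimal sketch (summarize_response and the sample dict are illustrative, shaped after the response above):

```python
def summarize_response(response):
    """Extract the guaranteed fields plus optional ones with safe defaults.
    Only final_response and iterations are guaranteed to be present."""
    return {
        "text": response["final_response"],
        "iterations": response["iterations"],
        "total_tokens": response.get("usage", {}).get("total_tokens", 0),
        "thinking": response.get("thinking", []),
    }

# Hypothetical minimal response: only the guaranteed keys are present.
sample = {"final_response": "Bitcoin is trading near...", "iterations": 2}
print(summarize_response(sample)["total_tokens"])  # 0 -- usage was absent
```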

Streaming (stream=True)

Each loop iteration yields a dict:

# Text arriving live
{"type": "model_delta", "data": {"text": "Bitcoin is trading..."}}

# Tool calls the model made
{"type": "service_call",     "data": {"service_name": "search_web", "params": {...}}}
{"type": "service_response", "data": {"service_name": "search_web", "success": True, "summary": {...}}}

# Final summary — always the last event
{"type": "result", "data": {"final_response": "...", "iterations": 2}}
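A loop over these events typically dispatches on type. A minimal consumer, shown here against a synthetic event list since a real stream comes from client.model.request(..., stream=True):

```python
def consume_stream(events):
    """Collect streamed text deltas and capture the final result payload.
    `events` is any iterable of event dicts in the shape documented above."""
    chunks = []
    result = None
    for event in events:
        if event["type"] == "model_delta":
            chunks.append(event["data"]["text"])
        elif event["type"] == "result":
            result = event["data"]
    return "".join(chunks), result

# Synthetic stand-in for a real stream.
fake_stream = [
    {"type": "model_delta", "data": {"text": "Bitcoin is "}},
    {"type": "model_delta", "data": {"text": "trading near..."}},
    {"type": "result",
     "data": {"final_response": "Bitcoin is trading near...", "iterations": 1}},
]
streamed, final = consume_stream(fake_stream)
print(final["iterations"])  # 1
```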

Error handling

from nineth import NinethClient, NinethAPIError

with NinethClient(default_model="1984-m3-0317") as client:
    try:
        response = client.model.request("Analyse ETH.")
    except NinethAPIError as exc:
        print("API error:", exc)
    except ValueError as exc:
        print("Configuration error:", exc)
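If you want retries on transient failures, a generic wrapper works; nothing below is SDK-specific, and retry_call plus its backoff numbers are illustrative:

```python
import time

def retry_call(fn, *, attempts=3, base_delay=0.5, retry_on=(Exception,)):
    """Call fn(), retrying up to `attempts` times on the given exception
    types with exponential backoff. Illustrative helper, not SDK API."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

You could then wrap a request as retry_call(lambda: client.model.request("Analyse ETH."), retry_on=(NinethAPIError,)).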

Authentication

Set NINETH_API_KEY in your environment or pass api_key= to the client constructor. The health check endpoint does not require a key.
