# ninetrix-sdk

Python SDK for Ninetrix — build, compose, and deploy AI agents in pure Python.

```shell
pip install ninetrix-sdk
```
## What it does
The Ninetrix SDK lets you define agents, tools, workflows, and multi-agent teams entirely in Python. Agents are portable — serialize to YAML, run locally, or deploy to Ninetrix Cloud.
## Core Concepts

### Agents
Define an agent with a role, tools, and an LLM provider. Run it synchronously, asynchronously, or as a streaming event source.
Agents support structured output (Pydantic models), token/cost budgets, and human-in-the-loop approval gates.
### Tools

The `@Tool` decorator turns any Python function into an agent tool. The function signature is automatically converted to a JSON Schema — no manual wiring needed.

Group related tools with `Toolkit` for cleaner organization.
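To make the signature-to-schema conversion concrete, here is a minimal stdlib-only sketch of how such a conversion can work. This is an illustration of the general technique, not the SDK's actual implementation; `signature_to_schema` and `get_weather` are hypothetical names.

```python
import inspect
from typing import get_type_hints

# Map Python annotations to JSON Schema types (a small illustrative subset).
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def signature_to_schema(fn):
    """Build a JSON Schema object for fn's parameters from its signature."""
    hints = get_type_hints(fn)
    sig = inspect.signature(fn)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        py_type = hints.get(name, str)
        properties[name] = {"type": _JSON_TYPES.get(py_type, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value → caller must supply it
    return {"type": "object", "properties": properties, "required": required}

def get_weather(city: str, units: str = "metric") -> str:
    """Return the current weather for a city."""
    return f"Weather in {city} ({units})"

schema = signature_to_schema(get_weather)
# schema["properties"] → {"city": {"type": "string"}, "units": {"type": "string"}}
# schema["required"]   → ["city"]  (units has a default, so it is optional)
```

A real implementation would also pull parameter descriptions from the docstring and handle container and `Optional` types, but the core idea — introspect the signature, emit a schema — is the same.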
### Tool Sources

Agents can use tools from multiple sources, all mixed together:

| Source | Description |
|---|---|
| `@Tool` functions | Local Python functions |
| MCP servers | Via MCP Gateway (JSON-RPC) |
| OpenAPI specs | Any REST API with an OpenAPI 3.x spec |
| Composio | Composio SDK integrations |
| Community plugins | `pip install ninetrix-source-*` (auto-discovered) |
The plugin system uses standard Python `entry_points` — anyone can publish a new tool source.
### Workflows

The `@Workflow` decorator defines sequential pipelines that chain agents together. Workflows support:

- Durable execution — checkpoint every step, resume on crash
- Declarative steps — `run_step(name, fn)` for clean step definitions
- Fan-out — `map(agent, items, concurrency)` for parallel execution
- Early exit — `terminate(reason)` for explicit abort
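The fan-out semantics — run an agent over a batch of items with a concurrency cap, preserving input order — can be sketched with the stdlib alone. This is a conceptual stand-in, not the SDK's `map`; `fan_out` and `summarize` are hypothetical names:

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(agent_fn, items, concurrency=4):
    """Run agent_fn over items with at most `concurrency` in flight,
    returning results in input order (what Executor.map guarantees)."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(agent_fn, items))

def summarize(doc: str) -> str:
    # Stand-in for a real agent invocation.
    return doc.upper()

results = fan_out(summarize, ["alpha", "beta", "gamma"], concurrency=2)
# results == ["ALPHA", "BETA", "GAMMA"]
```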
### Teams

`Team` provides LLM-based dynamic routing across multiple agents. The router agent picks the best specialist for each request — no manual if/else chains.
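The routing pattern is: a dispatcher chooses one specialist per request, then forwards the request to it. In the sketch below a keyword match stands in for the LLM router, purely to show the shape of the dispatch; all names are hypothetical:

```python
# Two specialist "agents" (stand-ins for real Agent objects).
specialists = {
    "billing": lambda q: f"[billing] handled: {q}",
    "support": lambda q: f"[support] handled: {q}",
}

def route(query: str) -> str:
    """Pick a specialist and forward the query.

    A real Team would ask an LLM router to choose; a keyword
    check stands in here so the example stays self-contained.
    """
    name = "billing" if "invoice" in query.lower() else "support"
    return specialists[name](query)

print(route("Where is my invoice?"))   # goes to the billing specialist
print(route("The app keeps crashing")) # goes to the support specialist
```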
### Ninetrix Context

The `Ninetrix` factory class sets provider and model defaults once. All agents, teams, and workflows created from the context inherit the configuration.
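The factory/context pattern is simple to picture: defaults live on the context, objects created through it inherit them, and per-object arguments still win. A toy stand-in (not the SDK's `Ninetrix` class; `Context` and the field names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Holds provider/model defaults that created objects inherit."""
    provider: str = "anthropic"
    model: str = "default-model"

    def agent(self, role: str, **overrides):
        # Context defaults first, explicit overrides second.
        cfg = {"provider": self.provider, "model": self.model, "role": role}
        cfg.update(overrides)
        return cfg

ctx = Context(provider="openai", model="gpt-4o")
researcher = ctx.agent("researcher")             # inherits both defaults
writer = ctx.agent("writer", model="gpt-4o-mini")  # per-agent override wins
```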
## Providers
Built-in LLM adapters with a unified interface:
- Anthropic (Claude)
- OpenAI (GPT)
- Google (Gemini)
- LiteLLM (100+ models)
- Fallback chains (try providers in order)
All provider exceptions are wrapped — raw third-party errors never surface.
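Fallback chains and exception wrapping combine naturally: try each provider in order, wrap every failure in one unified error type, and raise only when the whole chain is exhausted. A minimal sketch of that logic, with hypothetical names (`ProviderError`, `complete_with_fallback`), not the SDK's implementation:

```python
class ProviderError(Exception):
    """Unified error type so raw third-party exceptions never leak."""

def complete_with_fallback(providers, prompt):
    """Try (name, callable) pairs in order; wrap each failure;
    raise a single ProviderError only if every provider fails."""
    errors = []
    for name, fn in providers:
        try:
            return fn(prompt)
        except Exception as exc:
            errors.append(f"{name}: {exc}")
    raise ProviderError("; ".join(errors))

def flaky(prompt):
    raise TimeoutError("upstream timeout")  # simulated provider outage

def healthy(prompt):
    return f"ok: {prompt}"

result = complete_with_fallback([("primary", flaky), ("backup", healthy)], "hi")
# result == "ok: hi" — the primary's timeout was swallowed and recorded
```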
## Observability

- `EventBus` — pub-sub event system for all agent lifecycle events
- OpenTelemetry — opt-in tracing (graceful no-op if the OTEL SDK is not installed)
- Debug mode — pretty-print all events to stderr
- Dashboard telemetry — `RunnerReporter` posts events for trace visualization
- Token tracking — per-turn and per-run budget monitoring with cost estimates
## Persistence

- `InMemoryCheckpointer` — for local dev and testing
- `PostgresCheckpointer` — production-grade, with `SELECT FOR UPDATE` row locking
- Durable workflows automatically resume from the last successful step on crash
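Resume-on-crash follows from one rule: a step's result is checkpointed before the run proceeds, and a re-run returns the checkpointed result instead of executing again. A toy sketch with a dict standing in for a checkpointer backend (all names hypothetical):

```python
checkpoints = {}  # stand-in for InMemoryCheckpointer storage

def run_step(run_id, name, fn):
    """Execute fn at most once per (run_id, name); replay from
    the checkpoint on any subsequent attempt."""
    key = (run_id, name)
    if key not in checkpoints:
        checkpoints[key] = fn()
    return checkpoints[key]

calls = []

def expensive_extract():
    calls.append(1)          # count real executions
    return "extracted-data"

first = run_step("run-1", "extract", expensive_extract)
# Simulate a crash + restart: the same step is requested again,
# but the checkpoint answers and fn is not re-executed.
second = run_step("run-1", "extract", expensive_extract)
# first == second == "extracted-data"; len(calls) == 1
```

A Postgres-backed checkpointer does the same thing durably, using row locking so two workers cannot claim the same step concurrently.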
## Testing

Built-in testing utilities for deterministic agent tests without real LLM calls:

- `MockTool` — fake tools with call tracking and assertions
- `MockProvider` — scripted LLM responses
- `AgentSandbox` — isolated test harness with full assertion API
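The core idea behind scripted-response mocks is small enough to show inline. This is a generic illustration of the pattern, not the SDK's `MockProvider`; `ScriptedProvider` is a hypothetical name:

```python
class ScriptedProvider:
    """Returns canned responses in order and records every prompt,
    so tests are deterministic and can assert on what was asked."""

    def __init__(self, responses):
        self.responses = list(responses)
        self.prompts = []          # call tracking for assertions

    def complete(self, prompt: str) -> str:
        self.prompts.append(prompt)
        return self.responses.pop(0)

provider = ScriptedProvider(["The answer is 42."])
reply = provider.complete("What is the answer?")
# reply == "The answer is 42."
# provider.prompts == ["What is the answer?"]
```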
## Deployment

Agents can be served, built, and deployed from code:

- `agent.serve()` — FastAPI HTTP server (`/invoke`, `/stream`, `/health`, `/info`)
- `agent.build()` — serialize to YAML and build a Docker image via `ninetrix build`
- `agent.deploy()` — push to Ninetrix Cloud
## YAML Serialization

Every agent is serializable from day one:

- `agent.to_yaml()` — export to `agentfile.yaml` format
- `Agent.from_yaml()` — load from YAML back into a live agent
This makes agents portable between code-first and YAML-first workflows.
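The portability claim rests on the round-trip property: serialize, deserialize, get the same configuration back. A deliberately tiny illustration of that property for a flat string-valued config, using no YAML library (the SDK's real `agentfile.yaml` format is richer; `to_yaml`/`from_yaml` here are toy stand-ins):

```python
def to_yaml(cfg: dict) -> str:
    """Serialize a flat str->str config as 'key: value' lines
    (toy emitter; handles only this simple shape)."""
    return "\n".join(f"{k}: {v}" for k, v in cfg.items()) + "\n"

def from_yaml(text: str) -> dict:
    """Parse the toy format back into a dict."""
    return dict(line.split(": ", 1) for line in text.strip().splitlines())

cfg = {"role": "researcher", "provider": "anthropic"}
restored = from_yaml(to_yaml(cfg))
# restored == cfg: the round trip loses nothing
```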
## Optional Dependencies

Install only what you need:

```shell
pip install 'ninetrix-sdk[anthropic]'   # Anthropic provider
pip install 'ninetrix-sdk[openai]'      # OpenAI provider
pip install 'ninetrix-sdk[google]'      # Google provider
pip install 'ninetrix-sdk[providers]'   # All providers
pip install 'ninetrix-sdk[serve]'       # FastAPI server
pip install 'ninetrix-sdk[otel]'        # OpenTelemetry tracing
```
## Ecosystem

The SDK works with two community registries:

- Tools Hub — community tool registry. Use `hub://` URIs in `agentfile.yaml` to pull tools at build time. Each tool has a `TOOL.yaml` manifest with dependencies, credentials, and companion skills.
- Skills Hub — community skills library. Skills are prompt-layer playbooks (`SKILL.md`) injected into your agent's system prompt at build time. They teach agents how to work — no code required.
Tools provide capabilities. Skills teach how to use them. The SDK connects both via the `@Tool` decorator, `ToolSource` plugins, and YAML serialization.
## License

Apache 2.0