# nous-genai

Single-endpoint GenAI SDK (multi-provider, multimodal).

> This project has been archived by its maintainers. No new releases are expected.

Chinese documentation: readme_zh.md

One interface for calling multimodal models, with four ways to use it: Skill, MCP, CLI, SDK.
## Features

- Multi-provider: OpenAI, Google (Gemini), Anthropic (Claude), Aliyun (DashScope/Bailian), Volcengine (Doubao/Ark), Tuzi
- Multimodal: text/image/audio/video input and output (model-dependent)
- Unified API: a single `Client.generate()` for all providers
- Streaming: `generate_stream()` for incremental output
- Tool calling: function tools (model/provider-dependent)
- JSON Schema output: structured output (model/provider-dependent)
- MCP Server: Streamable HTTP and SSE transports
- Security: SSRF protection, DNS pinning, download limits, Bearer token auth (MCP)
## Installation

```bash
pip install nous-genai
```

For development:

```bash
pip install -e .
# or (recommended)
uv sync
```
## Configuration (zero-parameter)

The SDK/CLI/MCP server loads env files automatically in priority order (high → low):

`.env.local` > `.env.production` > `.env.development` > `.env.test`

Process environment variables override `.env.*` values (the loader uses `os.environ.setdefault()`).

Minimal `.env.local` (OpenAI only):

```bash
NOUS_GENAI_OPENAI_API_KEY=...
NOUS_GENAI_TIMEOUT_MS=120000
```

See docs/CONFIGURATION.md, or copy `.env.example` to `.env.local`.
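The `setdefault()`-based load order described above can be sketched as follows. This is an illustrative reimplementation, not the library's actual loader; the helper names are hypothetical:

```python
import os

def parse_env(text):
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        result[key.strip()] = value.strip()
    return result

def load_env_files(paths, environ=os.environ):
    """Apply env files highest-priority first.

    setdefault() never overwrites an existing key, so real process
    env vars (and keys set by a higher-priority file) always win.
    """
    for path in paths:
        if not os.path.exists(path):
            continue
        with open(path) as fh:
            for key, value in parse_env(fh.read()).items():
                environ.setdefault(key, value)
```

Passing the files highest-priority first makes the "first writer wins" semantics of `setdefault()` implement the documented precedence directly.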
## Quickstart

### Text generation

```python
from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:gpt-4o-mini",
        input=[Message(role="user", content=[Part.from_text("Hello!")])],
        output=OutputSpec(modalities=["text"]),
    )
)
print(resp.output[0].content[0].text)
```
### Streaming

```python
import sys

from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
req = GenerateRequest(
    model="openai:gpt-4o-mini",
    input=[Message(role="user", content=[Part.from_text("Tell me a joke")])],
    output=OutputSpec(modalities=["text"]),
)
for ev in client.generate_stream(req):
    if ev.type == "output.text.delta":
        sys.stdout.write(str(ev.data.get("delta", "")))
        sys.stdout.flush()
print()
```
### Image understanding

```python
from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part, PartSourcePath
from nous.genai.types import detect_mime_type

path = "./cat.png"
mime = detect_mime_type(path) or "application/octet-stream"

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:gpt-4o-mini",
        input=[
            Message(
                role="user",
                content=[
                    Part.from_text("Describe this image"),
                    Part(type="image", mime_type=mime, source=PartSourcePath(path=path)),
                ],
            )
        ],
        output=OutputSpec(modalities=["text"]),
    )
)
print(resp.output[0].content[0].text)
```
### List available models

```python
from nous.genai import Client

client = Client()
print(client.list_all_available_models())
```
## Providers

| Provider | Notes |
|---|---|
| `openai` | GPT-4, DALL·E, Whisper, TTS |
| `google` | Gemini, Imagen, Veo |
| `anthropic` | Claude |
| `aliyun` | DashScope / Bailian (OpenAI-compatible + AIGC) |
| `volcengine` | Ark / Doubao (OpenAI-compatible) |
| `tuzi-web` / `tuzi-openai` / `tuzi-google` / `tuzi-anthropic` | Tuzi adapters |
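Model identifiers throughout this README follow a `provider:model` convention (e.g. `openai:gpt-4o-mini`, `tuzi-web:chirp-v3-5`). A minimal parser for that convention, written here for illustration rather than taken from the SDK:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a 'provider:model' identifier into its two parts.

    Only the first colon separates provider from model, so model
    names that themselves contain colons survive intact.
    """
    provider, sep, model = model_id.partition(":")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider:model', got {model_id!r}")
    return provider, model
```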
## Binary output

A binary `Part.source` is a tagged union:

- Input: `bytes` / `path` / `base64` / `url` / `ref` (MCP forbids `bytes` and `path`)
- Output: `url` / `base64` / `ref` (the SDK does not auto-download to disk)

If you need to write output to a file, see `examples/demo.py` (`_write_binary()`), or reuse `Client.download_to_file()`, the built-in safe downloader.
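A tagged union of source variants like the one above can be modelled with dataclasses. The class and field names here are illustrative stand-ins, not the SDK's actual types:

```python
import base64
from dataclasses import dataclass
from typing import Union

# Hypothetical stand-ins for the output-side source variants.
@dataclass
class SourceBase64:
    data: str  # base64-encoded payload, inline in the response

@dataclass
class SourceUrl:
    url: str   # remote location; fetch with a safe downloader

@dataclass
class SourceRef:
    ref: str   # opaque provider-side reference

Source = Union[SourceBase64, SourceUrl, SourceRef]

def to_bytes(source: Source) -> bytes:
    """Resolve an output source to raw bytes where possible."""
    if isinstance(source, SourceBase64):
        return base64.b64decode(source.data)
    if isinstance(source, SourceUrl):
        raise NotImplementedError("download via a safe HTTP client")
    raise ValueError("refs must be resolved by the provider")
```

Dispatching on the variant type keeps the "no auto-download" policy explicit: only inline base64 resolves locally, while URLs require an explicit (and SSRF-guarded) download step.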
## CLI & MCP Server

```bash
# CLI
uv run genai --model openai:gpt-4o-mini --prompt "Hello"
uv run genai model available --all

# Tuzi Chirp music
uv run genai --model tuzi-web:chirp-v3-5 --prompt "Lo-fi hiphop beat, 30s" --no-wait
# ...later
uv run genai --model tuzi-web:chirp-v3-5 --job-id "<job_id>" --output-path demo_suno.mp3 --timeout-ms 600000

# MCP Server
uv run genai-mcp-server     # Streamable HTTP: /mcp, SSE: /sse
uv run genai-mcp-cli tools  # Debug CLI
```
## Security

- SSRF protection: rejects private/loopback URLs by default (set `NOUS_GENAI_ALLOW_PRIVATE_URLS=1` to allow them)
- DNS pinning: mitigates DNS rebinding
- Download limit: 128 MiB per URL by default (`NOUS_GENAI_URL_DOWNLOAD_MAX_BYTES`)
- Bearer token auth: for the MCP server
- Token rules: fine-grained access control
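A private/loopback URL check of the kind described above can be sketched with the standard library. This is illustrative only; it is not the SDK's implementation and omits the DNS-pinning step that guards against rebinding:

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_url_allowed(url: str) -> bool:
    """Reject URLs whose host resolves to any non-global address
    (private, loopback, link-local, reserved, ...)."""
    host = urlparse(url).hostname
    if host is None:
        return False
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if not ip.is_global:  # covers is_private, is_loopback, etc.
            return False
    return True
```

Checking every resolved address (not just the first) matters: an attacker-controlled hostname can return a mix of public and private records.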
## Testing

```bash
uv run pytest tests/ -v
```
## Docs

## License