
Single-endpoint GenAI SDK (multi-provider, multimodal)

This project has been archived by its maintainers; no new releases are expected.

Project description

nous-genai


Chinese documentation: readme_zh.md

One interface for calling multimodal models; four ways to use it: Skill, MCP, CLI, or SDK.

Features

  • Multi-provider: OpenAI, Google (Gemini), Anthropic (Claude), Aliyun (DashScope/Bailian), Volcengine (Doubao/Ark), Tuzi
  • Multimodal: text/image/audio/video input and output (model-dependent)
  • Unified API: a single Client.generate() for all providers
  • Streaming: generate_stream() for incremental output
  • Tool calling: function tools (model/provider-dependent)
  • JSON Schema output: structured output (model/provider-dependent)
  • MCP Server: Streamable HTTP and SSE transport
  • Security: SSRF protection, DNS pinning, download limits, Bearer token auth (MCP)

Installation

pip install nous-genai

For development:

pip install -e .
# or (recommended)
uv sync

Configuration (Zero-parameter)

The SDK, CLI, and MCP server load env files automatically, in priority order (high → low):

.env.local > .env.production > .env.development > .env.test

Process environment variables override every .env.* file (the loader uses os.environ.setdefault(), which never overwrites an existing key).
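
For illustration, a minimal sketch of that load order (a sketch of the documented behavior, not the library's actual loader):

import os

# Apply files from highest to lowest priority; setdefault() never
# overwrites a key, so earlier files and the process env always win.
for env_file in (".env.local", ".env.production", ".env.development", ".env.test"):
    if not os.path.exists(env_file):
        continue
    with open(env_file) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())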

Minimal .env.local (OpenAI only):

NOUS_GENAI_OPENAI_API_KEY=...
NOUS_GENAI_TIMEOUT_MS=120000

See docs/CONFIGURATION.md or copy .env.example to .env.local.

Quickstart

Text generation

from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:gpt-4o-mini",
        input=[Message(role="user", content=[Part.from_text("Hello!")])],
        output=OutputSpec(modalities=["text"]),
    )
)
print(resp.output[0].content[0].text)

Streaming

import sys
from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
req = GenerateRequest(
    model="openai:gpt-4o-mini",
    input=[Message(role="user", content=[Part.from_text("Tell me a joke")])],
    output=OutputSpec(modalities=["text"]),
)
for ev in client.generate_stream(req):
    if ev.type == "output.text.delta":
        sys.stdout.write(str(ev.data.get("delta", "")))
        sys.stdout.flush()
print()

Image understanding

from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part, PartSourcePath
from nous.genai.types import detect_mime_type

path = "./cat.png"
mime = detect_mime_type(path) or "application/octet-stream"

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:gpt-4o-mini",
        input=[
            Message(
                role="user",
                content=[
                    Part.from_text("Describe this image"),
                    Part(type="image", mime_type=mime, source=PartSourcePath(path=path)),
                ],
            )
        ],
        output=OutputSpec(modalities=["text"]),
    )
)
print(resp.output[0].content[0].text)

List available models

from nous.genai import Client

client = Client()
print(client.list_all_available_models())

Providers

  • openai: GPT-4, DALL·E, Whisper, TTS
  • google: Gemini, Imagen, Veo
  • anthropic: Claude
  • aliyun: DashScope / Bailian (OpenAI-compatible + AIGC)
  • volcengine: Ark / Doubao (OpenAI-compatible)
  • tuzi-web / tuzi-openai / tuzi-google / tuzi-anthropic: Tuzi adapters
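
Model ids use the provider:model form shown in the Quickstart (e.g. openai:gpt-4o-mini, tuzi-web:chirp-v3-5), so switching providers is a one-line change. A minimal sketch (the Claude model id below is illustrative, not a confirmed supported id):

from nous.genai import GenerateRequest, Message, OutputSpec, Part

# Same request shape as the Quickstart; only the model string changes.
req = GenerateRequest(
    model="anthropic:claude-3-5-sonnet",  # illustrative model id
    input=[Message(role="user", content=[Part.from_text("Hello!")])],
    output=OutputSpec(modalities=["text"]),
)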

Binary output

A binary Part.source is a tagged union of source types:

  • Input: bytes/path/base64/url/ref (MCP forbids bytes/path)
  • Output: url/base64/ref (SDK does not auto-download to disk)

If you need to write output to a file, see _write_binary() in examples/demo.py, or reuse Client.download_to_file(), the built-in safe downloader (sketched below).
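
A minimal sketch of the URL case (the model id, the part's source.url attribute, and the download_to_file() call signature are all assumptions here, not confirmed API):

from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:dall-e-3",  # hypothetical image-capable model id
        input=[Message(role="user", content=[Part.from_text("A watercolor cat")])],
        output=OutputSpec(modalities=["image"]),
    )
)

part = resp.output[0].content[0]
url = getattr(part.source, "url", None)  # URL-backed output (assumed attribute name)
if url is not None:
    # Reuse the built-in safe downloader; SSRF checks and size limits apply.
    client.download_to_file(url, "cat.png")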

CLI & MCP Server

# CLI
uv run genai --model openai:gpt-4o-mini --prompt "Hello"
uv run genai model available --all

# Tuzi Chirp music
uv run genai --model tuzi-web:chirp-v3-5 --prompt "Lo-fi hiphop beat, 30s" --no-wait
# ...later
uv run genai --model tuzi-web:chirp-v3-5 --job-id "<job_id>" --output-path demo_suno.mp3 --timeout-ms 600000

# MCP Server
uv run genai-mcp-server                    # Streamable HTTP: /mcp, SSE: /sse
uv run genai-mcp-cli tools                 # Debug CLI

Security

  • SSRF protection: rejects private/loopback URLs by default (NOUS_GENAI_ALLOW_PRIVATE_URLS=1 to allow)
  • DNS pinning: mitigates DNS rebinding
  • Download limit: 128MiB per URL by default (NOUS_GENAI_URL_DOWNLOAD_MAX_BYTES)
  • Bearer token auth: for MCP server
  • Token rules: fine-grained access control
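
Both URL-safety limits are plain environment variables, so they can be set in .env.local or in the process environment before constructing a Client. A sketch (assuming, as with the other settings, they are read when the client is created):

import os

# Loosen or tighten the defaults; both variable names come from the list above.
os.environ["NOUS_GENAI_ALLOW_PRIVATE_URLS"] = "1"  # permit private/loopback URLs
os.environ["NOUS_GENAI_URL_DOWNLOAD_MAX_BYTES"] = str(32 * 1024 * 1024)  # 32 MiB cap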

Testing

uv run pytest tests/ -v

Docs

See the docs/ directory, e.g. docs/CONFIGURATION.md.

License

Apache-2.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nous_genai-0.1.2.tar.gz (114.1 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

nous_genai-0.1.2-py3-none-any.whl (106.5 kB)


File details

Details for the file nous_genai-0.1.2.tar.gz.

File metadata

  • Download URL: nous_genai-0.1.2.tar.gz
  • Upload date:
  • Size: 114.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nous_genai-0.1.2.tar.gz

Algorithm     Hash digest
SHA256        072f373807d8f5f5438338718988512df3553a64c00a04815acd7f32db636027
MD5           8dc6b9a07b0f3846e4e0a7a5b5fca244
BLAKE2b-256   f6f5bdcb1287cb6699e57f11f7f7f3350fcc8f074b25bd825b2c1ed1fa55caf5


Provenance

The following attestation bundles were made for nous_genai-0.1.2.tar.gz:

Publisher: publish.yml on gravtice/nous-genai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file nous_genai-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: nous_genai-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 106.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nous_genai-0.1.2-py3-none-any.whl

Algorithm     Hash digest
SHA256        4a7ef81726557f1470f91694231a3fc40198f30910264ebb1f4e0644517e66bf
MD5           ca21d88b926ef01091678e6bb97f10c0
BLAKE2b-256   9b9de800a640482914e08d939e98d7a2943aa08a78dd23005ed8d5aa70bf08ec


Provenance

The following attestation bundles were made for nous_genai-0.1.2-py3-none-any.whl:

Publisher: publish.yml on gravtice/nous-genai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
