
Single-endpoint GenAI SDK (multi-provider, multimodal)

This project has been archived by its maintainers. No new releases are expected.

Project description

nous-genai


Chinese documentation: readme_zh.md

One interface for calling multimodal models, with four ways to use it: Skill, MCP, CLI, SDK.

Features

  • Multi-provider: OpenAI, Google (Gemini), Anthropic (Claude), Aliyun (DashScope/Bailian), Volcengine (Doubao/Ark), Tuzi
  • Multimodal: text/image/audio/video input and output (model-dependent)
  • Unified API: a single Client.generate() for all providers
  • Streaming: generate_stream() for incremental output
  • Tool calling: function tools (model/provider-dependent)
  • JSON Schema output: structured output (model/provider-dependent)
  • MCP Server: Streamable HTTP and SSE transport
  • Security: SSRF protection, DNS pinning, download limits, Bearer token auth (MCP)

Installation

pip install nous-genai

For development:

pip install -e .
# or (recommended)
uv sync

Configuration (Zero-parameter)

The SDK, CLI, and MCP server load env files automatically with the following priority (high → low):

.env.local > .env.production > .env.development > .env.test

Existing process environment variables override values from .env.* files (the loader uses os.environ.setdefault()).
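
The priority chain above can be sketched with a minimal loader (illustrative only, not the library's actual implementation): read files high-priority first, and rely on os.environ.setdefault() so earlier files and the existing process environment always win.

```python
import os

def load_env_files(paths=(".env.local", ".env.production", ".env.development", ".env.test")):
    """Load env files high-priority first; setdefault ensures earlier files
    (and pre-existing process environment variables) take precedence."""
    for path in paths:
        if not os.path.exists(path):
            continue
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue  # skip blanks, comments, and malformed lines
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
```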

Minimal .env.local (OpenAI only):

NOUS_GENAI_OPENAI_API_KEY=...
NOUS_GENAI_TIMEOUT_MS=120000

See docs/CONFIGURATION.md or copy .env.example to .env.local.

Quickstart

Text generation

from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:gpt-4o-mini",
        input=[Message(role="user", content=[Part.from_text("Hello!")])],
        output=OutputSpec(modalities=["text"]),
    )
)
print(resp.output[0].content[0].text)

Streaming

import sys
from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
req = GenerateRequest(
    model="openai:gpt-4o-mini",
    input=[Message(role="user", content=[Part.from_text("Tell me a joke")])],
    output=OutputSpec(modalities=["text"]),
)
for ev in client.generate_stream(req):
    if ev.type == "output.text.delta":
        sys.stdout.write(str(ev.data.get("delta", "")))
        sys.stdout.flush()
print()

Image understanding

from nous.genai import Client, GenerateRequest, Message, OutputSpec, Part, PartSourcePath
from nous.genai.types import detect_mime_type

path = "./cat.png"
mime = detect_mime_type(path) or "application/octet-stream"

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:gpt-4o-mini",
        input=[
            Message(
                role="user",
                content=[
                    Part.from_text("Describe this image"),
                    Part(type="image", mime_type=mime, source=PartSourcePath(path=path)),
                ],
            )
        ],
        output=OutputSpec(modalities=["text"]),
    )
)
print(resp.output[0].content[0].text)

List available models

from nous.genai import Client

client = Client()
print(client.list_all_available_models())

Providers

Provider                                               Notes
openai                                                 GPT-4, DALL·E, Whisper, TTS
google                                                 Gemini, Imagen, Veo
anthropic                                              Claude
aliyun                                                 DashScope / Bailian (OpenAI-compatible + AIGC)
volcengine                                             Ark / Doubao (OpenAI-compatible)
tuzi-web / tuzi-openai / tuzi-google / tuzi-anthropic  Tuzi adapters
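
Model identifiers throughout this README use a provider:model prefix scheme (e.g. openai:gpt-4o-mini). A hypothetical parser for that convention, splitting only on the first colon so model names containing colons survive:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split 'provider:model' into its two parts.

    partition() splits on the first colon only, so the model name
    may itself contain colons.
    """
    provider, sep, name = model_id.partition(":")
    if not sep or not provider or not name:
        raise ValueError(f"expected 'provider:model', got {model_id!r}")
    return provider, name
```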

Binary output

Binary Part.source is a tagged union:

  • Input: bytes/path/base64/url/ref (MCP forbids bytes/path)
  • Output: url/base64/ref (SDK does not auto-download to disk)

If you need to write binary output to a file, see examples/demo.py (_write_binary()) or use Client.download_to_file(), the built-in safe downloader.

CLI & MCP Server

# CLI
uv run genai --model openai:gpt-4o-mini --prompt "Hello"
uv run genai model available --all

# MCP Server
uv run genai-mcp-server                    # Streamable HTTP: /mcp, SSE: /sse
uv run genai-mcp-cli tools                 # Debug CLI

Security

  • SSRF protection: rejects private/loopback URLs by default (NOUS_GENAI_ALLOW_PRIVATE_URLS=1 to allow)
  • DNS pinning: mitigates DNS rebinding
  • Download limit: 128MiB per URL by default (NOUS_GENAI_URL_DOWNLOAD_MAX_BYTES)
  • Bearer token auth: for MCP server
  • Token rules: fine-grained access control
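
The SSRF check described above can be approximated with the standard library (an illustrative sketch, not the SDK's actual guard): resolve the host and reject private, loopback, and link-local addresses.

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_private_url(url: str) -> bool:
    """Return True for URLs an SSRF guard would reject by default:
    hosts resolving to private, loopback, or link-local addresses
    (or hosts that fail to resolve at all)."""
    host = urlparse(url).hostname
    if host is None:
        return True
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return True  # unresolvable: treat as unsafe
    for info in infos:
        addr = ipaddress.ip_address(info[4][0])
        if addr.is_private or addr.is_loopback or addr.is_link_local:
            return True
    return False
```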

Testing

uv run pytest tests/ -v

Docs

License

Apache-2.0


Download files

Download the file for your platform.

Source Distribution

nous_genai-0.1.0.tar.gz (111.8 kB)


Built Distribution


nous_genai-0.1.0-py3-none-any.whl (105.2 kB)


File details

Details for the file nous_genai-0.1.0.tar.gz.

File metadata

  • Download URL: nous_genai-0.1.0.tar.gz
  • Size: 111.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nous_genai-0.1.0.tar.gz

Algorithm    Hash digest
SHA256       6dca872c3aa705e8a89a2e0e9ea00744000ca35a531c129a51cd4b92bc7419c0
MD5          1d4aa087f29b041a0749f91cce4c5ec3
BLAKE2b-256  c3d6e650efdd6d08cf1f4308a7f5002dbf7e6692b5e418ce59a4828eb58866d0
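
A downloaded artifact can be checked against the SHA256 digest above with a standard streaming hash:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so large archives never load fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Compare the returned hex digest to the published value before installing from a manually downloaded file.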


Provenance

The following attestation bundles were made for nous_genai-0.1.0.tar.gz:

Publisher: publish.yml on gravtice/nous-genai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file nous_genai-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: nous_genai-0.1.0-py3-none-any.whl
  • Size: 105.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nous_genai-0.1.0-py3-none-any.whl

Algorithm    Hash digest
SHA256       32563d79e93907bd07e66ef56a40264f195c85523cefcd8bf7149405ced702f7
MD5          1b8a0c55cb71a1d328b3d62b1aab7c52
BLAKE2b-256  3ca07522a5e50cca52293316844b254c0f23e20f89a8c79433f2cf70761efb11


Provenance

The following attestation bundles were made for nous_genai-0.1.0-py3-none-any.whl:

Publisher: publish.yml on gravtice/nous-genai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
