
A unified Python client to normalize interfaces across major LLM providers (OpenAI, Anthropic, Gemini, DeepSeek, xAI).

Project description

onellmclient

A unified Python client that normalizes the interfaces of the major LLM providers, letting you call models from OpenAI, Anthropic, Gemini, DeepSeek, xAI, and others through a single API.

✨ Features

  • Unified interface: one API for multiple LLM vendors, with no need to learn each SDK
  • Works out of the box: a single install supports all major LLM providers (OpenAI, Anthropic, Gemini, DeepSeek, xAI)
  • Transparent switching: swap model providers at any time with almost no code changes
  • Full feature set: text generation, tool calling, structured output, and other core capabilities

📦 Installation

# With uv (recommended)
uv add onellmclient

# Or with pip
pip install onellmclient

Note: installing onellmclient automatically installs the SDKs for all supported LLM providers (OpenAI, Anthropic, Gemini). DeepSeek and xAI use OpenAI-compatible APIs, so no extra dependencies are required.

🚀 Quick Start

Basic usage

from onellmclient import Client

# Initialize the client (multiple providers supported)
client = Client(
    openai={"key": "your-openai-api-key"},
    anthropic={"key": "your-anthropic-api-key"},
    gemini={"key": "your-gemini-api-key"},
    deepseek={"key": "your-deepseek-api-key"},
    xai={"key": "your-xai-api-key"}
)

# Call an OpenAI model
response = client.completion(
    provider="openai",
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, please introduce yourself"}]
)
print(response.content)

# Switch to an Anthropic model; the code barely changes
response = client.completion(
    provider="anthropic",
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Hello, please introduce yourself"}]
)
print(response.content)

# Switching to a DeepSeek model is just as simple
response = client.completion(
    provider="deepseek",
    model="deepseek-v3.2-exp",
    messages=[{"role": "user", "content": "Hello, please introduce yourself"}]
)
print(response.content)

# Switch to an xAI Grok model
response = client.completion(
    provider="xai",
    model="grok-beta",
    messages=[{"role": "user", "content": "Hello, please introduce yourself"}]
)
print(response.content)

Advanced features

Structured output

# Define a JSON Schema
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
        "hobbies": {"type": "array", "items": {"type": "string"}}
    },
    "required": ["name", "age"]
}

response = client.completion(
    provider="openai",
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Please describe a fictional character"}],
    schema=schema
)
# response.content will be a JSON string that conforms to the schema
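Since the structured output arrives as a JSON string, it usually needs one `json.loads` pass before use. A minimal sketch, where the literal string stands in for `response.content` from a real structured-output call:

```python
import json

# Sample payload in the shape the schema above describes; this string stands
# in for response.content returned by a real structured-output call.
raw = '{"name": "Lin Wei", "age": 28, "hobbies": ["climbing", "photography"]}'

person = json.loads(raw)

# The schema marks "name" and "age" as required, so they are safe to access.
assert all(key in person for key in ("name", "age"))
print(f'{person["name"]}, {person["age"]}')  # Lin Wei, 28
```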

Tool calling

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"]
            }
        }
    }
]

response = client.completion(
    provider="openai",
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather like in Beijing today?"}],
    tools=tools
)

Agent: automatic tool execution (with a DeepSeek Web Search fallback)

completion() only returns tool calls; if you want tools executed automatically until a final answer is produced, use agent().

For providers without native web_search support (such as DeepSeek), you can pass the library's built-in gemini_grounding as the web_search argument; the model then gains web access by calling the unified tool name web_search:

import os
from onellmclient import Client
from onellmclient.tools import gemini_grounding

client = Client(
    deepseek={"key": os.getenv("DEEPSEEK_API_KEY"), "base": os.getenv("DEEPSEEK_API_BASE")},
    gemini={"key": os.getenv("GEMINI_API_KEY"), "base": os.getenv("GEMINI_API_BASE")},
)

resp = client.agent(
    provider="deepseek",
    model="deepseek-chat",
    instructions="Use the web_search tool to retrieve up-to-date information when needed.",
    messages=[{"role": "user", "content": "What is the result of the latest F1 grand prix?"}],
    web_search=gemini_grounding,
)
print(resp.content)

Reasoning (Claude)

response = client.completion(
    provider="anthropic",
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Solve this equation: 2x + 5 = 13"}],
    reasoning_effort="medium"  # off, minimal, low, medium, high
)

📋 Supported Providers and Models

Provider    Supported models                                         Special features
OpenAI      gpt-4o, gpt-4o-mini, gpt-4, gpt-3.5-turbo, etc.          Structured output, tool calling, web search
Anthropic   claude-3-5-sonnet, claude-3-opus, claude-3-haiku, etc.   Reasoning, tool calling
Gemini      gemini-1.5-pro, gemini-1.5-flash, etc.                   Tool calling
DeepSeek    deepseek-v3.2-exp, deepseek-chat, etc.                   Structured output, tool calling
xAI         grok-beta, grok-vision-beta, etc.                        Structured output, tool calling, reasoning, web search

🔧 API Reference

Client initialization

Client(
    openai={"key": "api-key", "base": "https://api.openai.com/v1"},                 # optional
    anthropic={"key": "api-key", "base": "https://api.anthropic.com"},              # optional
    gemini={"key": "api-key", "base": "https://generativelanguage.googleapis.com"}, # optional
    deepseek={"key": "api-key", "base": "https://api.deepseek.com"},                # optional
    xai={"key": "api-key", "base": "https://api.x.ai/v1"}                           # optional
)

completion method

client.completion(
    provider: str,                    # "openai", "anthropic", "gemini", "deepseek", "xai"
    model: str,                       # model name
    messages: List[Dict],             # list of messages
    instructions: Optional[str],      # system instructions
    schema: Optional[Dict],           # JSON Schema (structured output)
    tools: Optional[List[Dict]],      # tool definitions
    reasoning_effort: Optional[str],  # reasoning effort: "off", "minimal", "low", "medium", "high" (Anthropic, xAI)
    temperature: Optional[float],     # temperature, 0-2
    web_search: bool,                 # enable web search (OpenAI, xAI)
    tool_choice: Optional[str]        # tool selection strategy: "auto", "none", "required"
)

agent method

agent() automatically executes the tool calls the model returns and feeds the tool results back to the model until a final answer is produced.

client.agent(
    provider: str,
    model: str,
    messages: List[Dict],
    instructions: Optional[str],
    schema: Optional[Dict],
    tools: Optional[List[Dict]],      # note: in agent mode each tool also needs a handler field (a Python callable)
    reasoning_effort: Optional[str],
    temperature: Optional[float],
    web_search: bool | Callable,      # bool: use the provider's native web_search; Callable: inject the unified tool name web_search and let the agent execute it
    tool_choice: Optional[str],
)
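Per the note above, each tool passed to agent() carries an extra handler key pointing at a Python callable. A sketch of that wiring (the handler body is a hypothetical stand-in):

```python
def get_weather(city: str) -> dict:
    # Hypothetical stand-in; a real handler would call a weather service.
    return {"city": city, "condition": "cloudy", "temp_c": 18}

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string", "description": "City name"}},
                "required": ["city"],
            },
        },
        # agent() invokes this callable when the model requests the tool.
        "handler": get_weather,
    }
]
```

agent() then runs the loop itself: when the model requests get_weather, it calls the handler with the model's arguments and feeds the return value back as the tool result.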

💡 Best Practices

  1. Environment variables: keep API keys in environment variables
import os
client = Client(
    openai={"key": os.getenv("OPENAI_API_KEY")},
    anthropic={"key": os.getenv("ANTHROPIC_API_KEY")},
    deepseek={"key": os.getenv("DEEPSEEK_API_KEY")},
    xai={"key": os.getenv("XAI_API_KEY")}
)
  2. Error handling: catch specific exceptions
try:
    response = client.completion(provider="openai", model="gpt-4", messages=[...])
except ValueError as e:
    print(f"Configuration error: {e}")
  3. Model selection: choose the right model for each scenario
# Fast-response scenarios
response = client.completion(provider="openai", model="gpt-4o-mini", messages=[...])

# Complex reasoning scenarios
response = client.completion(provider="anthropic", model="claude-3-5-sonnet", messages=[...], reasoning_effort="high")

🤝 Contributing

Issues and pull requests are welcome!

📄 License

MIT License



Download files

Download the file for your platform.

Source Distribution

onellmclient-0.1.5.tar.gz (73.9 kB)

Uploaded Source

Built Distribution


onellmclient-0.1.5-py3-none-any.whl (18.4 kB)

Uploaded Python 3

File details

Details for the file onellmclient-0.1.5.tar.gz.

File metadata

  • Download URL: onellmclient-0.1.5.tar.gz
  • Size: 73.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for onellmclient-0.1.5.tar.gz
Algorithm Hash digest
SHA256 06e0b347788d87ee83732f2161ed9b44e4f48f93f45c07daaca655aa3b3a306c
MD5 a56e49f0d3257643e00814d2eae58c9b
BLAKE2b-256 79995e66b38565aa5669fd8f2469ae557deee0ff827f8f3715191685545ef1a3


File details

Details for the file onellmclient-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: onellmclient-0.1.5-py3-none-any.whl
  • Size: 18.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for onellmclient-0.1.5-py3-none-any.whl
Algorithm Hash digest
SHA256 6c0ccc92287142495159b4e6844358a90c5fceb6d52bd300648644537573c99b
MD5 e9dfc9fccd70a316fdc6e30562cb14a4
BLAKE2b-256 96ab42490958e8534e1505d0db4efdfd95ba722a8042793e654f29e37eb8af6c

