Hexin Proxy Server

A FastAPI server that exposes an OpenAI-compatible API, providing AI capabilities by proxying the hexin_engine backend service.
Features

- Chat Completions API: OpenAI-compatible chat completion endpoint
- Embeddings API: OpenAI-compatible text embedding endpoint
- Model listing: lists the available AI models
- Streaming responses: real-time streamed chat responses
- Multi-model support: multiple large language models and embedding models
Supported Endpoints

Chat Completions

POST /v1/chat/completions - create a chat completion

- Supports streaming and non-streaming responses
- Supports tool calls and function calling
- Supports multiple models: GPT, Claude, Gemini, DeepSeek, and more
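When streaming is enabled, the response body arrives as OpenAI-style server-sent events. A minimal sketch of parsing such a stream by hand (the sample lines below are illustrative, not captured from this server):

```python
import json

def parse_sse_chunks(lines):
    """Extract content deltas from OpenAI-style SSE lines."""
    pieces = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            pieces.append(delta["content"])
    return "".join(pieces)

# Illustrative sample of a streamed response body
sample = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print(parse_sse_chunks(sample))  # Hello
```

In practice a streaming HTTP client (or the OpenAI SDK) handles this framing for you; the sketch only shows the wire format.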
Embeddings

POST /v1/embeddings - create text embeddings

- Supported models: text-embedding-ada-002, text-embedding-3-small, text-embedding-3-large
- Supports single and batch text input
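A common next step with returned embedding vectors is similarity comparison. A small plain-Python sketch (the vectors below are toy values, not real model output):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embedding output
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, 1.0]
v3 = [0.0, 1.0, 0.0]
print(cosine_similarity(v1, v2))  # ~1.0 (same direction)
print(cosine_similarity(v1, v3))  # 0.0 (orthogonal)
```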
Models

GET /v1/models - list available models

- Returns the lists of chat and embedding models
Quick Start

1. Install dependencies

pip install -r requirements.txt
# or
pip install fastapi uvicorn python-dotenv loguru requests openai pydantic

2. Configure environment variables

Create a .env file:
HITHINK_APP_ID=your_app_id
HITHINK_APP_SECRET=your_app_secret
HITHINK_APP_URL=your_app_url
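The server reads these variables at startup (python-dotenv is among the dependencies). A hedged sketch of fail-fast validation — `load_settings` is a hypothetical helper for illustration, not part of hexin_server's actual API:

```python
import os

REQUIRED_VARS = ["HITHINK_APP_ID", "HITHINK_APP_SECRET", "HITHINK_APP_URL"]

def load_settings(environ=os.environ):
    """Return the required settings, raising early if any are missing."""
    missing = [name for name in REQUIRED_VARS if not environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: environ[name] for name in REQUIRED_VARS}

# Example with an explicit mapping instead of the real environment
settings = load_settings({
    "HITHINK_APP_ID": "your_app_id",
    "HITHINK_APP_SECRET": "your_app_secret",
    "HITHINK_APP_URL": "your_app_url",
})
print(settings["HITHINK_APP_ID"])  # your_app_id
```

Failing fast on missing credentials gives a clearer error than a 500 from the proxy on the first request.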
3. Start the server

# Run directly
python -m hexin_server
# Or specify options
python -m hexin_server --host 0.0.0.0 --port 8777 --reload

4. Test the endpoints

Chat Completions example
curl -X POST "http://localhost:8777/v1/chat/completions" \
  -H "Authorization: Bearer sk-fastapi-proxy-key-12345" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
Embeddings example

curl -X POST "http://localhost:8777/v1/embeddings" \
  -H "Authorization: Bearer sk-fastapi-proxy-key-12345" \
  -H "Content-Type: application/json" \
  -d '{
    "input": "Hello, world!",
    "model": "text-embedding-ada-002"
  }'
Using the OpenAI client library

import openai

# Configure the client
client = openai.OpenAI(
    api_key="sk-fastapi-proxy-key-12345",
    base_url="http://localhost:8777/v1"
)

# Chat completion
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

# Create embeddings
embeddings = client.embeddings.create(
    model="text-embedding-ada-002",
    input="Hello, world!"
)
Documentation

- Embedding API usage guide (EMBEDDING_API.md) - detailed documentation for the embedding endpoint
Project structure

hexin-proxy-server/
├── hexin_server/
│   ├── __init__.py
│   └── __main__.py        # Main server code
├── tests/
│   └── test_embedding.py  # Embedding endpoint test script
├── EMBEDDING_API.md       # Embedding API usage guide
├── README.md
├── requirements.txt
└── .env.example
Testing

The project includes test scripts to verify functionality:

# Test the embedding endpoint
python tests/test_embedding.py
Contributing

Issues and pull requests are welcome!

License