FastAPI server that proxies OpenAI API endpoints using the hexin_engine backend.

# Hexin Proxy Server

A FastAPI server that exposes an OpenAI-compatible API, proxying requests to the Hexin backend service.
## Features

- **Chat Completions API**: OpenAI-compatible chat completion endpoint
- **Embeddings API**: OpenAI-compatible text embedding endpoint
- **Model listing**: list the available AI models
- **Streaming responses**: real-time streamed chat responses
- **Multi-model support**: multiple large language models and embedding models
## Supported Endpoints

### Chat Completions

`POST /v1/chat/completions` - create a chat completion

- Supports streaming and non-streaming responses
- Supports tool calls and function calling
- Supports multiple models: GPT, Claude, Gemini, DeepSeek, and more

### Embeddings

`POST /v1/embeddings` - create text embeddings

- Supported models: text-embedding-ada-002, text-embedding-3-small, text-embedding-3-large
- Supports single and batch text input

### Models

`GET /v1/models` - list available models

- Returns the list of chat and embedding models
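The `/v1/models` endpoint follows the OpenAI "list" response convention. As a sketch, this is how a client might pull the model ids out of that response; the model names in the sample are placeholders, not the server's actual catalogue:

```python
def list_model_ids(response: dict) -> list[str]:
    """Return the ids of all models in an OpenAI-style /v1/models response."""
    return [model["id"] for model in response.get("data", [])]

# Placeholder response in the OpenAI "list" shape
sample_response = {
    "object": "list",
    "data": [
        {"id": "gpt-4o", "object": "model"},
        {"id": "text-embedding-ada-002", "object": "model"},
    ],
}

print(list_model_ids(sample_response))  # ['gpt-4o', 'text-embedding-ada-002']
```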
## Quick Start

### 1. Install dependencies

```bash
pip install -r requirements.txt
# or
pip install fastapi uvicorn python-dotenv loguru requests openai pydantic
```

### 2. Configure environment variables

Create a `.env` file:

```env
HITHINK_APP_ID=your_app_id
HITHINK_APP_SECRET=your_app_secret
HITHINK_APP_URL=your_app_url
```
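The `python-dotenv` dependency loads these values into the process environment at startup. For illustration only, a minimal equivalent of what that loading step does (the real library handles quoting, interpolation, and more):

```python
import os

def load_env_file(path: str = ".env") -> dict:
    """Minimal .env parser: KEY=VALUE lines, blank lines and '#' comments ignored.
    (python-dotenv does this and much more; this is just a sketch.)"""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    os.environ.update(values)
    return values
```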
### 3. Start the server

```bash
# Run directly
python -m hexin_server

# Or specify options
python -m hexin_server --host 0.0.0.0 --port 8777 --reload
```
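The flags above suggest an argparse-plus-uvicorn entry point. A hypothetical sketch of how such a CLI could be wired (the actual code lives in `hexin_server/__main__.py`; the default host and the app module path are assumptions):

```python
import argparse

def build_arg_parser() -> argparse.ArgumentParser:
    """CLI flags matching the quick-start examples above."""
    parser = argparse.ArgumentParser(prog="hexin_server")
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=8777)
    parser.add_argument("--reload", action="store_true")
    return parser

def main(argv=None) -> argparse.Namespace:
    """Parse flags; a real entry point would then hand off to uvicorn, e.g.
    uvicorn.run("hexin_server.__main__:app", host=args.host, port=args.port,
    reload=args.reload) -- module path assumed for illustration."""
    return build_arg_parser().parse_args(argv)
```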
### 4. Test the endpoints

#### Chat Completions example

```bash
curl -X POST "http://localhost:8777/v1/chat/completions" \
  -H "Authorization: Bearer sk-fastapi-proxy-key-12345" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
```
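To get a streamed response instead, add `"stream": true` to the request body; the server then replies with OpenAI-style server-sent-event chunks. A sketch of assembling the assistant's text from those `data:` lines (the chunk shape follows the OpenAI streaming convention):

```python
import json

def collect_stream_content(sse_lines) -> str:
    """Assemble assistant text from OpenAI-style SSE 'data:' lines,
    where each chunk carries choices[0].delta.content."""
    parts = []
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)

# Placeholder chunks in the OpenAI streaming shape
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": ", world!"}}]}',
    "data: [DONE]",
]
print(collect_stream_content(sample))  # Hello, world!
```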
#### Embeddings example

```bash
curl -X POST "http://localhost:8777/v1/embeddings" \
  -H "Authorization: Bearer sk-fastapi-proxy-key-12345" \
  -H "Content-Type: application/json" \
  -d '{
    "input": "Hello, world!",
    "model": "text-embedding-ada-002"
  }'
```
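For batch processing, `input` takes a list of strings instead of a single string, and the response contains one vector per input. Once vectors are returned, cosine similarity is the usual way to compare them; a self-contained sketch (no server needed):

```python
import math

def cosine_similarity(a, b) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Batch request body: a list of inputs instead of a single string
batch_payload = {
    "model": "text-embedding-ada-002",
    "input": ["first document", "second document"],
}

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
```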
## Using the OpenAI Client Library

```python
import openai

# Configure the client to point at the proxy
client = openai.OpenAI(
    api_key="sk-fastapi-proxy-key-12345",
    base_url="http://localhost:8777/v1"
)

# Chat completion
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

# Create embeddings
embeddings = client.embeddings.create(
    model="text-embedding-ada-002",
    input="Hello, world!"
)
```
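Embedding models cap their input length, so long documents are usually split before being sent as a batch `input` list. A simple character-based chunker for illustration (character counts stand in for tokens here; real limits are measured in tokens, and the defaults below are arbitrary):

```python
def chunk_text(text: str, max_chars: int = 2000, overlap: int = 200) -> list[str]:
    """Split text into overlapping character windows before embedding."""
    if max_chars <= overlap:
        raise ValueError("max_chars must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        start += max_chars - overlap
    return chunks
```

The resulting list can be passed directly as the `input` argument of `client.embeddings.create`.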
## Documentation

- Embedding API usage guide (`EMBEDDING_API.md`) - detailed documentation for the embedding endpoints
## Project Structure

```
hexin-proxy-server/
├── hexin_server/
│   ├── __init__.py
│   └── __main__.py        # main server code
├── tests/
│   └── test_embedding.py  # embedding endpoint test script
├── EMBEDDING_API.md       # embedding API usage guide
├── README.md
├── requirements.txt
└── .env.example
```
## Testing

The project includes a test script to verify functionality:

```bash
# Test the embedding endpoint
python tests/test_embedding.py
```
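The test script itself isn't shown here, but a typical check verifies the response invariants: one embedding per input and a consistent vector dimension. An illustrative sketch (not the project's actual test code):

```python
def validate_embeddings_response(response: dict, num_inputs: int) -> bool:
    """Check OpenAI-style embedding invariants: one vector per input,
    all vectors the same dimension."""
    data = response.get("data", [])
    if len(data) != num_inputs:
        return False
    dims = {len(item["embedding"]) for item in data}
    return len(dims) == 1

# Placeholder response in the OpenAI embeddings shape
sample = {
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.1, 0.2, 0.3]},
        {"object": "embedding", "index": 1, "embedding": [0.4, 0.5, 0.6]},
    ],
}
print(validate_embeddings_response(sample, 2))  # True
```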
## Contributing

Issues and pull requests are welcome!

## License
## File details

### hexin_server-0.1.4.tar.gz

- Download URL: hexin_server-0.1.4.tar.gz
- Size: 14.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.1 CPython/3.10.9 Darwin/24.5.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `183ba2ef4fac7a20f7f5ccb6f4c0dcfaccf972ef9abbdad285e2d01df4029f19` |
| MD5 | `049aeb84f6f686ea0968d969a27cf41a` |
| BLAKE2b-256 | `31483ccc23afa3127ecd26d547f9a499a28e8ed5b9399dedb84e7a9df299ca75` |
### hexin_server-0.1.4-py3-none-any.whl

- Download URL: hexin_server-0.1.4-py3-none-any.whl
- Size: 14.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.1 CPython/3.10.9 Darwin/24.5.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `1338fea068da5ac4bf6a7e7f7daadae736732b400c8fbb9a35b134473ba95d05` |
| MD5 | `eaf142096431c5255a56f86f2c979a68` |
| BLAKE2b-256 | `bc71639a3caa1db38fcb4402d306414d305b9e4f931134ec904f4823133664dd` |