OpenAI Forward
OpenAI API forwarding service
The fastest way to deploy an OpenAI API forwarding proxy.
Test access: https://caloi.top/v1/chat/completions is equivalent to https://api.openai.com/v1/chat/completions
Features
- Supports forwarding of all OpenAI interfaces
- Supports request IP verification
- Supports streaming forwarding
- Supports default API key
- pip installation and deployment
- Docker deployment
- Support for multiple worker processes
- Support for specifying a forwarding route prefix
- Chat content security: streaming filtering of chat content
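The streaming forwarding feature listed above can be sketched conceptually as a generator that relays each upstream chunk to the client as soon as it arrives, rather than buffering the whole response. This is an illustrative sketch, not the project's actual implementation; `forward_stream` and the fake byte chunks are made up for the example.

```python
def forward_stream(upstream_chunks):
    """Yield each upstream chunk to the client as soon as it is received.

    Illustrative only: in the real service the chunks would come from the
    streamed OpenAI HTTP response, not from an in-memory iterator.
    """
    for chunk in upstream_chunks:
        yield chunk  # relay immediately; no buffering of the full reply

# Example with a fake upstream SSE stream:
fake_upstream = iter([b"data: chunk-1\n\n",
                      b"data: chunk-2\n\n",
                      b"data: [DONE]\n\n"])
chunks = list(forward_stream(fake_upstream))
```

Because nothing is buffered, the client starts seeing tokens as soon as the upstream starts producing them.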
Usage
The proxy address https://caloi.top, set up by the author, is used as an example here.
Using in a module
JS/TS
```js
import { Configuration } from "openai";

const configuration = new Configuration({
  basePath: "https://caloi.top",
  apiKey: "sk-******",
});
```
Python
```python
import openai

openai.api_base = "https://caloi.top"
openai.api_key = "sk-******"
```
Image Generation (DALL-E):
```shell
curl --location 'https://caloi.top/v1/images/generations' \
--header 'Authorization: Bearer sk-******' \
--header 'Content-Type: application/json' \
--data '{
    "prompt": "A photo of a cat",
    "n": 1,
    "size": "512x512"
}'
```
chatgpt-web
Modify `OPENAI_API_BASE_URL` in the Docker Compose file to the address of the proxy service we set up:

```yaml
OPENAI_API_BASE_URL: https://caloi.top
```
ChatGPT-Next-Web
Replace `BASE_URL` in the Docker startup command with the address of the proxy service we set up:

```shell
docker run -d -p 3000:3000 -e OPENAI_API_KEY="sk-******" -e CODE="<your password>" -e BASE_URL="caloi.top" yidadaa/chatgpt-next-web
```
Service Deployment
Two deployment methods are provided; choose either one.
Use pip
Installation
```shell
pip install openai-forward
```
Run the forwarding service
The port can be specified with `--port` (default 8000), and the number of worker processes with `--workers` (default 1).

```shell
openai_forward run --port=9999 --workers=1
```

The service is now up. To use it, replace https://api.openai.com with the service address http://{ip}:{port}.
Of course, `OPENAI_API_KEY` can also be passed in as an environment variable to serve as the default API key, so that clients do not need to send an Authorization header when requesting the relevant routes. Startup command with a default API key:

```shell
OPENAI_API_KEY="sk-xxx" openai_forward run --port=9999 --workers=1
```
Note: If both the default API key and the API key passed in the request header exist, the API key in the request header will override the default API key.
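The override rule above can be sketched as a small key-resolution function. This is a hypothetical illustration, not the project's actual code; `resolve_api_key` is a made-up name.

```python
from typing import Optional

def resolve_api_key(headers: dict, default_key: Optional[str]) -> Optional[str]:
    """Pick the API key to forward upstream.

    A client-supplied Authorization header always wins; the server-side
    default key is only used when the client sends none.
    """
    auth = headers.get("Authorization")
    if auth:
        # Strip the "Bearer " scheme prefix if present
        return auth[len("Bearer "):] if auth.startswith("Bearer ") else auth
    return default_key

key = resolve_api_key({"Authorization": "Bearer sk-client"}, "sk-default")
```

Here `key` is the client's `sk-client`, even though a default key was configured.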
Use Docker (recommended)
```shell
docker run --name="openai-forward" -d -p 9999:8000 beidongjiedeguang/openai-forward:latest
```
Port 9999 on the host is mapped to the service, which can then be accessed at http://{ip}:9999.
Note: You can also pass in the environment variable OPENAI_API_KEY=sk-xxx as the default API key in the startup command.
Service Usage
Simply replace the OpenAI API address with the address of the service we set up, such as:
https://api.openai.com/v1/chat/completions
Replace with
http://{ip}:{port}/v1/chat/completions
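In other words, only the origin of the URL changes; the API path stays the same. A tiny helper can make this concrete (`to_proxy_url` is an illustrative name, not part of the project):

```python
def to_proxy_url(openai_url: str, proxy_base: str) -> str:
    """Swap the official OpenAI origin for the proxy's; the path is unchanged."""
    return openai_url.replace("https://api.openai.com", proxy_base.rstrip("/"))

url = to_proxy_url("https://api.openai.com/v1/chat/completions",
                   "http://127.0.0.1:9999")
```

Any client that lets you configure the API base URL can use the proxy this way, with no other code changes.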
Configuration
openai-forward run
Parameter Configuration Options
| Configuration Option | Description | Default Value |
|---|---|---|
| `--port` | Service port number | 8000 |
| `--workers` | Number of worker processes | 1 |
Environment Variable Configuration Options
Refer to the `.env` file in the project root directory.
| Environment Variable | Description | Default Value |
|---|---|---|
| `OPENAI_API_KEY` | Default API key | None |
| `OPENAI_BASE_URL` | Forwarding base URL | https://api.openai.com |
| `LOG_CHAT` | Whether to log chat content | true |
| `ROUTE_PREFIX` | Route prefix | None |
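To illustrate what the route prefix does: if `ROUTE_PREFIX` were set to, say, `openai`, clients would request `http://{ip}:{port}/openai/v1/chat/completions` instead of `/v1/...`. A minimal sketch of the path construction (the function name and prefix value are assumptions for the example, not the project's code):

```python
def prefixed_path(route_prefix, path):
    """Prepend the configured route prefix (if any) to an OpenAI API path."""
    if not route_prefix:
        return path  # no prefix configured: paths are forwarded as-is
    return "/" + route_prefix.strip("/") + path

p1 = prefixed_path("openai", "/v1/chat/completions")
p2 = prefixed_path(None, "/v1/chat/completions")
```

A prefix is useful when the forwarding service shares a host with other applications behind the same reverse proxy.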
TODO
| Environment Variable | Description | Default Value |
|---|---|---|
| `IP_WHITELIST` | IP whitelist | None |
| `IP_BLACKLIST` | IP blacklist | None |