OpenAI Forward
OpenAI API forwarding service
The fastest way to deploy OpenAI API forwarding.
Test access: https://caloi.top/openai/v1/chat/completions is equivalent to https://api.openai.com/v1/chat/completions
In other words, https://caloi.top/openai is equivalent to https://api.openai.com.
Features
- Forwards all OpenAI API endpoints
- Request IP verification
- Streaming Response
- Supports a default API key (with round-robin rotation across multiple keys)
- pip installation and deployment
- Docker deployment
- Support for multiple worker processes
- Support for specifying the forwarding routing prefix
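The round-robin rotation across multiple default API keys mentioned above can be pictured with a minimal sketch (the keys and the helper name are illustrative placeholders, not the project's actual internals):

```python
from itertools import cycle

# Hypothetical sketch: rotate through the keys supplied in OPENAI_API_KEY,
# handing out the next one for each incoming request.
keys = "sk-aaa sk-bbb sk-ccc".split()  # placeholder keys
key_pool = cycle(keys)

def next_key():
    """Return the next default API key in round-robin order."""
    return next(key_pool)
```

Each call to `next_key()` yields the next key in order, wrapping back to the first after the last.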
Usage
The examples below use the author's own proxy address, https://caloi.top/openai.
Using in a module
Python
import openai
openai.api_base = "https://caloi.top/openai/v1"
openai.api_key = "sk-******"
JS/TS
import { Configuration } from "openai";
const configuration = new Configuration({
basePath: "https://caloi.top/openai/v1",
apiKey: "sk-******",
});
Image Generation (DALL-E):
curl --location 'https://caloi.top/openai/v1/images/generations' \
--header 'Authorization: Bearer sk-******' \
--header 'Content-Type: application/json' \
--data '{
"prompt": "A photo of a cat",
"n": 1,
"size": "512x512"
}'
chatgpt-web
In Docker Compose, set OPENAI_API_BASE_URL to the address of the proxy service we set up:
OPENAI_API_BASE_URL: https://caloi.top/openai
ChatGPT-Next-Web
In the docker startup command, replace BASE_URL with the address of the proxy service we set up:
docker run -d -p 3000:3000 -e OPENAI_API_KEY="sk-******" -e CODE="<your password>" -e BASE_URL="caloi.top/openai" yidadaa/chatgpt-next-web
Deploy
Two deployment methods are provided, just choose one.
Use pip (recommended)
Installation
pip install openai-forward
Run forwarding service
The port number can be specified with --port (default 8000), and the number of worker processes with --workers (default 1):
openai_forward run --port=9999 --workers=1
The service is now running. To use it, replace https://api.openai.com with the service address http://{ip}:{port}.
OPENAI_API_KEY can also be passed in as an environment variable to serve as the default API key, so that clients do not need to send an Authorization header when requesting the relevant routes. Startup command with a default API key:
OPENAI_API_KEY="sk-xxx" openai_forward run --port=9999 --workers=1
Note: If both the default API key and the API key passed in the request header exist, the API key in the request header will override the default API key.
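This override rule can be sketched as follows (an illustration of the behavior just described, not the service's actual code; `pick_api_key` is a hypothetical name):

```python
def pick_api_key(headers, default_key):
    """Prefer the key from the request's Authorization header over the default."""
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        return auth.removeprefix("Bearer ")
    return default_key
```

For example, `pick_api_key({"Authorization": "Bearer sk-client"}, "sk-default")` yields the client's key, while a request without the header falls back to `"sk-default"`.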
Use Docker
docker run --name="openai-forward" -d -p 9999:8000 beidongjiedeguang/openai-forward:latest
Host port 9999 is mapped to the container, so the service can be accessed at http://{ip}:9999.
Note: You can also pass in the environment variable OPENAI_API_KEY=sk-xxx as the default API key in the startup command.
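For example (the key value is a placeholder):

```
docker run --name="openai-forward" -d -p 9999:8000 \
  -e OPENAI_API_KEY="sk-xxx" \
  beidongjiedeguang/openai-forward:latest
```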
Service Usage
Simply replace the official OpenAI API address with the address of the service we set up. For example, for Chat Completion, replace

https://api.openai.com/v1/chat/completions

with

http://{ip}:{port}/v1/chat/completions
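The swap is a pure base-URL replacement; a minimal helper to illustrate (the forward_base value is a placeholder for your own deployment address):

```python
OFFICIAL_BASE = "https://api.openai.com"

def rewrite(url, forward_base="http://127.0.0.1:9999"):
    """Point an official OpenAI endpoint at the forwarding service."""
    return url.replace(OFFICIAL_BASE, forward_base, 1)

rewrite("https://api.openai.com/v1/chat/completions")
# -> "http://127.0.0.1:9999/v1/chat/completions"
```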
Configuration
openai-forward run
Parameter Configuration Options
| Configuration Option | Description | Default Value |
|---|---|---|
| --port | Service port number | 8000 |
| --workers | Number of worker processes | 1 |
Environment Variable Configuration Options
Refer to the .env file in the project root directory.
| Environment Variable | Description | Default Value |
|---|---|---|
| OPENAI_API_KEY | Default API key; multiple keys can be given, separated by spaces. | None |
| OPENAI_BASE_URL | Forwarding base URL | https://api.openai.com |
| LOG_CHAT | Whether to log chat content | true |
| ROUTE_PREFIX | Route prefix | None |
| IP_WHITELIST | IP whitelist, separated by spaces. | None |
| IP_BLACKLIST | IP blacklist, separated by spaces. | None |
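Putting the table together, a sample .env might look like this (all values are illustrative; check the project's own .env file for the exact format):

```
OPENAI_API_KEY="sk-xxx1 sk-xxx2"
OPENAI_BASE_URL="https://api.openai.com"
LOG_CHAT="true"
ROUTE_PREFIX="openai"
IP_WHITELIST="8.8.8.8"
```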