
🚀 openai-forward · OpenAI API forwarding service · streaming forward

Project description



OpenAI Forward

OpenAI API forwarding service
The fastest way to deploy OpenAI API forwarding


This project addresses the problem that some regions cannot directly access OpenAI. The service is deployed on a server that can reach the OpenAI API, and OpenAI requests are forwarded through it, i.e. it acts as a reverse proxy.

Test access: https://caloi.top/openai/v1/chat/completions is equivalent to https://api.openai.com/v1/chat/completions.
Or, to put it another way, https://caloi.top/openai is equivalent to https://api.openai.com.
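This equivalence is just a prefix rewrite on the URL. A minimal sketch (for illustration only, not part of the project's code):

```python
def to_proxy(url: str, proxy_base: str = "https://caloi.top/openai") -> str:
    """Rewrite an official OpenAI API URL to its forwarded equivalent."""
    official = "https://api.openai.com"
    if not url.startswith(official):
        raise ValueError(f"not an OpenAI API URL: {url}")
    return proxy_base + url[len(official):]

print(to_proxy("https://api.openai.com/v1/chat/completions"))
# https://caloi.top/openai/v1/chat/completions
```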


Features

  • Supports forwarding of all OpenAI interfaces
  • Request IP verification
  • Streaming responses
  • Supports a default API key (round-robin over multiple API keys)
  • pip installation and deployment
  • Docker deployment
  • Support for multiple worker processes
  • Support for specifying the forwarding route prefix
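The round-robin behavior behind "default API key with multiple keys" can be sketched with `itertools.cycle` (an illustration of the idea, not the project's actual implementation):

```python
from itertools import cycle

# Rotate through a pool of default API keys, one per request.
keys = cycle(["sk-key1", "sk-key2", "sk-key3"])
picked = [next(keys) for _ in range(4)]
print(picked)  # ['sk-key1', 'sk-key2', 'sk-key3', 'sk-key1']
```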

Usage

Here, the author's proxy address, https://caloi.top/openai, is used as an example.

Using in a module

Python

  import openai
  openai.api_base = "https://caloi.top/openai/v1"
  openai.api_key = "sk-******"
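Streaming responses arrive as server-sent events. A minimal sketch of reassembling the `data:` chunks that a streaming chat completion emits (the sample payloads below are illustrative):

```python
import json

# Illustrative SSE lines as emitted by a streaming chat completion.
sse_lines = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]

text = ""
for line in sse_lines:
    payload = line[len("data: "):]
    if payload == "[DONE]":  # the stream's end-of-transmission marker
        break
    delta = json.loads(payload)["choices"][0]["delta"]
    text += delta.get("content", "")
print(text)  # Hello
```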

JS/TS

  import { Configuration } from "openai";

  const configuration = new Configuration({
    basePath: "https://caloi.top/openai/v1",
    apiKey: "sk-******",
  });

Image Generation (DALL-E):

curl --location 'https://caloi.top/openai/v1/images/generations' \
--header 'Authorization: Bearer sk-******' \
--header 'Content-Type: application/json' \
--data '{
    "prompt": "A photo of a cat",
    "n": 1,
    "size": "512x512"
}'
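The same request can be built from Python's standard library. A sketch that only constructs the request without sending it (the proxy host follows the example above):

```python
import json
import urllib.request

# Build (but do not send) the image-generation request against the proxy.
proxy_base = "https://caloi.top/openai"
req = urllib.request.Request(
    url=f"{proxy_base}/v1/images/generations",
    data=json.dumps({"prompt": "A photo of a cat", "n": 1, "size": "512x512"}).encode(),
    headers={"Authorization": "Bearer sk-******", "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; omitted here.
print(req.full_url)  # https://caloi.top/openai/v1/images/generations
```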

chatgpt-web

Modify the OPENAI_API_BASE_URL in Docker Compose to the address of the proxy service we set up:

OPENAI_API_BASE_URL: https://caloi.top/openai 

ChatGPT-Next-Web

Replace BASE_URL in the docker startup command with the address of the proxy service we set up:

docker run -d -p 3000:3000 -e OPENAI_API_KEY="sk-******" -e CODE="<your password>" -e BASE_URL="caloi.top/openai" yidadaa/chatgpt-next-web 

Deploy

Two deployment methods are provided, just choose one.

Use pip (recommended)

Installation

pip install openai-forward 

Run the forwarding service. The port can be specified with --port (default 8000) and the number of worker processes with --workers (default 1).

openai_forward run --port=9999 --workers=1 

The service is now set up; to use it, replace https://api.openai.com with the service's address, http://{ip}:{port}.

OPENAI_API_KEY can also be passed in as an environment variable to serve as the default API key, so that clients do not need to send an Authorization header when requesting the relevant routes. Startup command with a default API key:

OPENAI_API_KEY="sk-xxx" openai_forward run --port=9999 --workers=1 

Note: if both a default API key and an API key in the request header are present, the one in the request header overrides the default.
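That precedence rule can be sketched as follows (an illustration, not the project's actual code):

```python
from typing import Optional

def effective_api_key(header_auth: Optional[str], default_key: Optional[str]) -> Optional[str]:
    """The key from the Authorization header, if present, overrides the default key."""
    if header_auth:
        return header_auth.removeprefix("Bearer ").strip()
    return default_key

print(effective_api_key("Bearer sk-client", "sk-default"))  # sk-client
print(effective_api_key(None, "sk-default"))                # sk-default
```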

Use Docker

docker run --name="openai-forward" -d -p 9999:8000 beidongjiedeguang/openai-forward:latest 

Host port 9999 is mapped to the container, so the service can be accessed at http://{ip}:9999. Note: you can also pass the environment variable OPENAI_API_KEY=sk-xxx in the startup command to set a default API key.

Service Usage

Simply replace the official OpenAI API address with the address of the service you set up. For example, for Chat Completions:

https://api.openai.com/v1/chat/completions 

Replace with

http://{ip}:{port}/v1/chat/completions 

Configuration

`openai_forward run` parameter configuration options

| Configuration Option | Description | Default |
| --- | --- | --- |
| --port | Service port number | 8000 |
| --workers | Number of worker processes | 1 |

Environment Variable Configuration Options
Refer to the .env file in the project root directory.

| Environment Variable | Description | Default |
| --- | --- | --- |
| OPENAI_API_KEY | Default API key; multiple keys separated by spaces | None |
| OPENAI_BASE_URL | Base URL to forward to | https://api.openai.com |
| LOG_CHAT | Whether to log chat content | true |
| ROUTE_PREFIX | Route prefix | None |
| IP_WHITELIST | IP whitelist, separated by spaces | None |
| IP_BLACKLIST | IP blacklist, separated by spaces | None |
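A sketch of a possible .env file; all values below are illustrative assumptions, and the meaning of each variable is given in the table above:

```
# Two default API keys, rotated round-robin (space-separated)
OPENAI_API_KEY=sk-xxx1 sk-xxx2
OPENAI_BASE_URL=https://api.openai.com
LOG_CHAT=true
# Serve under /openai instead of / (illustrative value)
ROUTE_PREFIX=openai
```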

Chat Log

Logs are saved to the Log/ directory under the current working directory.
Chat log files start with chat_ and are flushed to the file every 5 rounds by default.
The recording format is as follows:

{'host': xxx, 'model': xxx, 'message': [{'user': xxx}, {'assistant': xxx}]}
{'assistant': xxx}

{'host': ...}
{'assistant': ...}

...
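The log lines appear to be Python dict literals rather than JSON (note the single quotes), so they can be parsed with `ast.literal_eval`. A sketch with illustrative placeholder values:

```python
import ast

# Illustrative lines in the format shown above; values are placeholders.
lines = [
    "{'host': '1.2.3.4', 'model': 'gpt-3.5-turbo', 'message': [{'user': 'hi'}]}",
    "{'assistant': 'Hello!'}",
]
records = [ast.literal_eval(line) for line in lines]
print(records[1]["assistant"])  # Hello!
```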



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openai_forward-0.1.7.tar.gz (10.8 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

openai_forward-0.1.7-py3-none-any.whl (13.3 kB)

Uploaded Python 3

File details

Details for the file openai_forward-0.1.7.tar.gz.

File metadata

  • Download URL: openai_forward-0.1.7.tar.gz
  • Upload date:
  • Size: 10.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.11

File hashes

Hashes for openai_forward-0.1.7.tar.gz
Algorithm Hash digest
SHA256 40553a092358a8791a742d492a6df7c08792c77c245159b60a03ad2ea0dcd733
MD5 59c9bf441c7ccfcb4564acf52e1c335a
BLAKE2b-256 c2b3f57c5de5e902d8623f3c565298e8106f91f19a2433b3a5318a6e63663b7a

See more details on using hashes here.

File details

Details for the file openai_forward-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: openai_forward-0.1.7-py3-none-any.whl
  • Upload date:
  • Size: 13.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.11

File hashes

Hashes for openai_forward-0.1.7-py3-none-any.whl
Algorithm Hash digest
SHA256 0220dbd5b065653650c3b730a9dfe09ad1a12886b70ceb06969e39d7d428416f
MD5 f35dac0523d9420c81ce3992c8f7b181
BLAKE2b-256 88b1a0012eb8e43700f28412eaf38d6192105d7d0cef5bbaf5fc52895ee9522e

See more details on using hashes here.
