
🚀 OpenAI API Forward · OpenAI API forwarding service · streaming forward

Project description

中文 | English


OpenAI Forward

OpenAI API forwarding service
The fastest way to deploy OpenAI API forwarding


This project is designed to solve the problem that some regions cannot access OpenAI directly. The service is deployed on a server that can reach the OpenAI API, and OpenAI requests are forwarded through it, i.e. it acts as a reverse proxy.

Test access: https://caloi.top/v1/chat/completions is equivalent to https://api.openai.com/v1/chat/completions
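For example, the following sketch (assuming the requests package, a valid API key, and an example model name) sends the same chat completion request through the proxy; the two base URLs are interchangeable:

# Minimal sketch: the same request body works against either base URL.
# Assumes the `requests` package and a valid API key; model name is an example.
import requests

BASE = "https://caloi.top"  # interchangeable with "https://api.openai.com"
resp = requests.post(
    f"{BASE}/v1/chat/completions",
    headers={"Authorization": "Bearer sk-******"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "hello"}],
    },
)
print(resp.json())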


Features

  • Supports forwarding of all OpenAI interfaces
  • Request IP verification
  • Streaming Response
  • Supports a default API key (cycling through multiple API keys)
  • pip installation and deployment
  • Docker deployment
  • Support for multiple worker processes
  • Support for specifying the forwarding routing prefix

Usage

Here, the author's own proxy address, https://caloi.top, is used as an example.

Using in a module

Python

  import openai
  openai.api_base = "https://caloi.top/v1"
  openai.api_key = "sk-******"
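A chat completion can then be issued through the proxy. A minimal sketch, assuming the pre-1.0 openai Python SDK (the version that exposes openai.api_base); the model name is only an example:

# Chat completion routed through the proxy (pre-1.0 `openai` SDK assumed).
import openai

openai.api_base = "https://caloi.top/v1"
openai.api_key = "sk-******"

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # example model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message["content"])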

JS/TS

  import { Configuration } from "openai";

  const configuration = new Configuration({
    basePath: "https://caloi.top/v1",
    apiKey: "sk-******",
  });

Image Generation (DALL-E):

curl --location 'https://caloi.top/v1/images/generations' \
--header 'Authorization: Bearer sk-******' \
--header 'Content-Type: application/json' \
--data '{
    "prompt": "A photo of a cat",
    "n": 1,
    "size": "512x512"
}'
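The same image request can also be made from Python; a sketch assuming the pre-1.0 openai SDK:

# Image generation through the proxy (pre-1.0 `openai` SDK assumed).
import openai

openai.api_base = "https://caloi.top/v1"
openai.api_key = "sk-******"

result = openai.Image.create(prompt="A photo of a cat", n=1, size="512x512")
print(result["data"][0]["url"])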

chatgpt-web

Modify the OPENAI_API_BASE_URL in Docker Compose to the address of the proxy service we set up:

OPENAI_API_BASE_URL: https://caloi.top 

ChatGPT-Next-Web

Replace BASE_URL in the docker startup command with the address of the proxy service we set up:

docker run -d -p 3000:3000 -e OPENAI_API_KEY="sk-******" -e CODE="<your password>" -e BASE_URL="caloi.top" yidadaa/chatgpt-next-web 

Service Deployment

Two service deployment methods are provided; choose either one.

Use pip (recommended)

Installation

pip install openai-forward 

Run the forwarding service. The port number can be specified with --port (default 8000), and the number of worker processes with --workers (default 1).

openai_forward run --port=9999 --workers=1 

The service is now set up; to use it, replace https://api.openai.com with the address of the service, http://{ip}:{port}.
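As a quick sanity check, the sketch below (assuming the requests package and a deployment reachable at localhost:9999) lists models through the proxy:

# List models through the forwarding service (host/port are assumptions).
import requests

resp = requests.get(
    "http://localhost:9999/v1/models",
    headers={"Authorization": "Bearer sk-******"},
)
print(resp.status_code, resp.json())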

OPENAI_API_KEY can also be passed in as an environment variable to serve as the default API key, so that clients do not need to send an Authorization header when requesting the relevant routes. Startup command with a default API key:

OPENAI_API_KEY="sk-xxx" openai_forward run --port=9999 --workers=1 

Note: If both the default API key and the API key passed in the request header exist, the API key in the request header will override the default API key.
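A sketch of this default-key behaviour (assuming the requests package, a deployment on localhost:9999 started with OPENAI_API_KEY set, and an example model name): the Authorization header can simply be omitted.

# No Authorization header: the service falls back to the default API key.
import requests

resp = requests.post(
    "http://localhost:9999/v1/chat/completions",
    json={
        "model": "gpt-3.5-turbo",  # example model name
        "messages": [{"role": "user", "content": "ping"}],
    },
)
print(resp.json())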

Use Docker

docker run --name="openai-forward" -d -p 9999:8000 beidongjiedeguang/openai-forward:latest 

This maps port 9999 of the host, so the service can be accessed at http://{ip}:9999. Note: you can also pass the environment variable OPENAI_API_KEY=sk-xxx in the startup command to set a default API key.

Service Usage

Simply replace the OpenAI API address with the address of the service we set up. For example, for Chat Completion:

https://api.openai.com/v1/chat/completions 

Replace with

http://{ip}:{port}/v1/chat/completions 

Configuration

openai_forward run parameter configuration options

Configuration Option | Description | Default Value
--port | Service port number | 8000
--workers | Number of worker processes | 1

Environment Variable Configuration Options
Refer to the .env file in the project root directory.

Environment Variable | Description | Default Value
OPENAI_API_KEY | Default API key; multiple keys may be given, separated by spaces | None
OPENAI_BASE_URL | Forwarding base URL | https://api.openai.com
LOG_CHAT | Whether to log chat content | true
ROUTE_PREFIX | Route prefix | None
IP_WHITELIST | IP whitelist, entries separated by spaces | None
IP_BLACKLIST | IP blacklist, entries separated by spaces | None
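An illustrative .env sketch based on the table above (values and quoting are examples only; check the .env file shipped in the repository for the authoritative format):

# Multiple default keys may be given, separated by spaces
OPENAI_API_KEY="sk-xxx1 sk-xxx2"
OPENAI_BASE_URL="https://api.openai.com"
LOG_CHAT=true
ROUTE_PREFIX=
IP_WHITELIST=
IP_BLACKLIST=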

