
Proxy server to Argo API, OpenAI format compatible


argo-openai-proxy

This project is a proxy application that forwards requests to an ARGO API and optionally converts the responses to be compatible with OpenAI's API format. It can be used in conjunction with autossh-tunnel-dockerized or other secure connection tools.

NOTICE OF USAGE

The machine or server making API calls to Argo must be connected to the Argonne internal network or through a VPN on an Argonne-managed computer if you are working off-site. Your instance of the argo proxy should always be on-premise at an Argonne machine. The software is provided "as is," without any warranties. By using this software, you accept that the authors, contributors, and affiliated organizations will not be liable for any damages or issues arising from its use. You are solely responsible for ensuring the software meets your requirements.

Deployment

Prerequisites

  • Python 3.10+ is required.
    We recommend using conda/mamba or pipx to manage a dedicated environment.
    Conda/Mamba: download and install from https://conda-forge.org/download/

  • Install dependencies:

    pip install .
    

Configuration File

If you don't want to configure it manually, the First-Time Setup will create it for you automatically.

The application uses config.yaml for configuration. Here's an example:

port: 44497
host: 0.0.0.0
argo_url: "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/chat/"
argo_stream_url: "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/streamchat/"
argo_embedding_url: "https://apps.inside.anl.gov/argoapi/api/v1/resource/embed/"
user: "your_username" # set during first-time setup
verbose: true # can be changed during setup
num_workers: 5
timeout: 600 # in seconds

Running the Application

To start the application:

argo-proxy [config_path]
  • Without arguments: searches for config.yaml under ~/.config/argoproxy/, ~/.argoproxy/, or the current directory

  • With path: uses specified config file

    argo-proxy /path/to/config.yaml
    

First-Time Setup

When running without an existing config file:

  1. The script offers to create config.yaml from config.sample.yaml
  2. Automatically selects a random available port (can be overridden)
  3. Prompts for:
    • Your username (sets user field)
    • Verbose mode preference (sets verbose field)
  4. Validates connectivity to configured URLs
  5. Shows the generated config in a formatted display for review before proceeding

Example session:

$ argo-proxy 
No valid configuration found.
Would you like to create it from config.sample.yaml? [Y/n]: 
Creating new configuration...
Use port [52226]? [Y/n/<port>]: 
Enter your username: your_username
Enable verbose mode? [Y/n] 
Set timeout to [600] seconds? [Y/n/<timeout>] 
Created new configuration at: /home/your_username/.config/argoproxy/config.yaml
Using port 52226...
Validating URL connectivity...
Current configuration:
--------------------------------------
{
    "host": "0.0.0.0",
    "port": 52226,
    "user": "your_username",
    "argo_url": "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/chat/",
    "argo_stream_url": "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/streamchat/",
    "argo_embedding_url": "https://apps.inside.anl.gov/argoapi/api/v1/resource/embed/",
    "verbose": true,
    "num_workers": 5,
    "timeout": 600
}
--------------------------------------
# ... proxy server starting info display ...

Configuration Options Reference

Option               Description                           Default
host                 Host address to bind the server to    0.0.0.0
port                 Application port                      randomly assigned
argo_url             ARGO chat API URL                     dev URL (for now)
argo_stream_url      ARGO stream API URL                   dev URL (for now)
argo_embedding_url   ARGO embedding API URL                prod URL
user                 Your username                         set during setup
verbose              Debug logging                         true
num_workers          Worker processes                      5
timeout              Request timeout (seconds)             600

argo-proxy CLI Options

$ argo-proxy -h
usage: argo-proxy [-h] [--show] [--host HOST] [--port PORT] [--num-worker NUM_WORKER]
                  [--verbose | --quiet] [--version]
                  [config]

Argo Proxy CLI

positional arguments:
  config                Path to the configuration file

options:
  -h, --help            show this help message and exit
  --show, -s            Show the current configuration during launch
  --host HOST, -H HOST  Host address to bind the server to
  --port PORT, -p PORT  Port number to bind the server to
  --num-worker NUM_WORKER, -n NUM_WORKER
                        Number of worker processes to run
  --verbose, -v         Enable verbose logging, override if `verbose` set False in config
  --quiet, -q           Disable verbose logging, override if `verbose` set True in config
  --version, -V         Show the version and exit.

Usage

Endpoints

OpenAI Compatible

These endpoints convert responses from the ARGO API to be compatible with OpenAI's format:

  • /v1/chat/completions: Converts ARGO chat/completions responses to OpenAI-compatible format.
  • /v1/completions: Legacy completions endpoint; responses are converted to OpenAI format.
  • /v1/embeddings: Accesses ARGO Embedding API with response conversion.
  • /v1/models: Lists available models in OpenAI-compatible format.
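Because these endpoints speak the OpenAI format, any OpenAI-compatible client can target the proxy. As a stdlib-only sketch (the proxy address and default model are assumptions; adjust to your config.yaml and the model table below):

```python
import json
import urllib.request

# Assumed proxy address; use the host/port from your config.yaml.
PROXY = "http://localhost:44497"

def build_chat_payload(messages, model="argo:gpt-4o"):
    """Build an OpenAI-style chat completion request body."""
    return {"model": model, "messages": messages}

def chat_completion(messages, model="argo:gpt-4o"):
    """POST the payload to the proxy and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{PROXY}/v1/chat/completions",
        data=json.dumps(build_chat_payload(messages, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# With the proxy running:
# reply = chat_completion([{"role": "user", "content": "Hello!"}])
# print(reply["choices"][0]["message"]["content"])
```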

Not OpenAI Compatible

These endpoints interact directly with the ARGO API and do not convert responses to OpenAI's format:

  • /v1/chat: Proxies requests to the ARGO API without conversion.
  • /v1/status: Responds with a simple "hello" from GPT-4o, confirming the service is alive.
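As a minimal sketch, the status endpoint can serve as a liveness probe. The proxy address here is an assumption based on the sample config; substitute the host and port from your own config.yaml:

```python
import urllib.request

# Assumed proxy address; use the host/port from your config.yaml.
STATUS_URL = "http://localhost:44497/v1/status"

def proxy_alive(url=STATUS_URL, timeout=5):
    """Return True if the status endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, timeout, or HTTP error.
        return False

# print(proxy_alive())  # True once the proxy is up and can reach ARGO
```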

Timeout Override

You can override the default timeout by passing a timeout parameter (in seconds) with your request.

For details on making this override in different query flavors, see Timeout Override Examples.
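As a sketch of one such override, the timeout field can ride along in the request body. The proxy address, model name, and the 120-second value below are all illustrative:

```python
import json
import urllib.request

# Illustrative per-request timeout override: the `timeout` field
# (seconds) is added to an otherwise ordinary chat completion body.
payload = {
    "model": "argo:gpt-4o",
    "messages": [{"role": "user", "content": "Summarize this document."}],
    "timeout": 120,  # overrides the configured default (e.g. 600 s) for this call
}
req = urllib.request.Request(
    "http://localhost:44497/v1/chat/completions",  # assumed proxy address
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment with a running proxy
```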

Models

Chat Models

Original ARGO Model Name   Argo Proxy Name
gpt35                      argo:gpt-3.5-turbo
gpt35large                 argo:gpt-3.5-turbo-16k
gpt4                       argo:gpt-4
gpt4large                  argo:gpt-4-32k
gpt4turbo                  argo:gpt-4-turbo-preview
gpt4o                      argo:gpt-4o
gpt4olatest                argo:gpt-4o-latest
gpto1preview               argo:gpt-o1-preview, argo:o1-preview
gpto1mini                  argo:gpt-o1-mini, argo:o1-mini
gpto3mini                  argo:gpt-o3-mini, argo:o3-mini
gpto1                      argo:gpt-o1, argo:o1

Embedding Models

Original ARGO Model Name   Argo Proxy Name
ada002                     argo:text-embedding-ada-002
v3small                    argo:text-embedding-3-small
v3large                    argo:text-embedding-3-large
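A minimal embedding request might look like the following sketch. The model name comes from the table above; the proxy address is an assumption based on the sample config:

```python
import json
import urllib.request

# Build an OpenAI-style embedding request; each item in `input`
# should yield one embedding vector in the response's `data` list.
payload = {
    "model": "argo:text-embedding-3-small",
    "input": ["hello world", "argo proxy"],
}
req = urllib.request.Request(
    "http://localhost:44497/v1/embeddings",  # assumed proxy address
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# With the proxy running:
# with urllib.request.urlopen(req) as resp:
#     vectors = json.loads(resp.read())["data"]
```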

Examples

Chat Completion Example

For examples of how to use the /v1/chat/completions, /v1/completions, and /v1/chat endpoints, see the following:

Embedding Example

o1 Chat Example

OpenAI Client Example

Folder Structure

The following is an overview of the project's directory structure:

$ tree -I "__pycache__|*.egg-info|dist|dev_scripts|config.yaml"
.
├── config.sample.yaml
├── examples
│   ├── chat_completions_example.py
│   ├── chat_completions_example_stream.py
│   ├── chat_example.py
│   ├── chat_example_stream.py
│   ├── completions_example.py
│   ├── completions_example_stream.py
│   ├── embedding_example.py
│   ├── o1_chat_example.py
│   └── o3_chat_example_pyclient.py
├── LICENSE
├── Makefile
├── pyproject.toml
├── README.md
├── run_app.sh
├── src
│   └── argoproxy
│       ├── app.py
│       ├── chat.py
│       ├── cli.py
│       ├── completions.py
│       ├── config.py
│       ├── constants.py
│       ├── embed.py
│       ├── extras.py
│       ├── __init__.py
│       ├── py.typed
│       └── utils.py
└── timeout_examples.md

4 directories, 27 files

Bug Reports and Contributions

This project was developed in my spare time. Bugs and issues may exist. If you encounter any or have suggestions for improvements, please open an issue or submit a pull request. Your contributions are highly appreciated!



Download files

Download the file for your platform.

Source Distribution

argo_proxy-2.5.0.tar.gz (19.9 kB)

Built Distribution


argo_proxy-2.5.0-py3-none-any.whl (24.5 kB)

File details

Details for the file argo_proxy-2.5.0.tar.gz.

File metadata

  • Download URL: argo_proxy-2.5.0.tar.gz
  • Upload date:
  • Size: 19.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.17

File hashes

Hashes for argo_proxy-2.5.0.tar.gz:
  • SHA256: 46ea6ca16484f4044d5fae21fdc2fef30f858c6565eff3ee4bc9b351935abcc2
  • MD5: 1cb8f248612fc549365d563a559171b0
  • BLAKE2b-256: a86416f7562303fa5f930af566c45f45f82527e893161ce9a157fecd8be10a4c


File details

Details for the file argo_proxy-2.5.0-py3-none-any.whl.

File metadata

  • Download URL: argo_proxy-2.5.0-py3-none-any.whl
  • Upload date:
  • Size: 24.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.17

File hashes

Hashes for argo_proxy-2.5.0-py3-none-any.whl:
  • SHA256: 91b49ccf7633fdc1c22ca762650b5feefbcab2f8b4a519c1db98a0f7d3ab9692
  • MD5: 5f91cc3bad535ee4b1ae4688d1ae82e1
  • BLAKE2b-256: 349482d8ac948ec75cce113424659620585818e19cb3a901bdcb126c717d2d2a

