# argo-openai-proxy

Proxy server to the Argo API, compatible with OpenAI's API format.
This project is a proxy application that forwards requests to the ARGO API and optionally converts the responses to be compatible with OpenAI's API format. It can be used in conjunction with autossh-tunnel-dockerized or other secure connection tools.
## TL;DR

```bash
pip install argo-proxy  # install the package
argo-proxy              # run the proxy
```
## NOTICE OF USAGE

The machine or server making API calls to Argo must be connected to the Argonne internal network, or through a VPN on an Argonne-managed computer if you are working off-site. Your instance of argo-proxy should always run on-premise on an Argonne machine. The software is provided "as is," without any warranties. By using this software, you accept that the authors, contributors, and affiliated organizations will not be liable for any damages or issues arising from its use. You are solely responsible for ensuring the software meets your requirements.
## Deployment

### Prerequisites

- Python 3.10+ is required.
- It is recommended to use conda, mamba, or pipx to manage an exclusive environment.
  - Conda/Mamba: download and install from https://conda-forge.org/download/
  - pipx: download and install from https://pipx.pypa.io/stable/installation/
### Installation

Install the current version from PyPI:

```bash
pip install argo-proxy
```

To upgrade:

```bash
argo-proxy --version              # display current version, check against PyPI
pip install argo-proxy --upgrade
```

Or, if you decide to use the dev version, from the root of the cloned repo:

```bash
pip install .
```
### Configuration File

The application uses `config.yaml` for configuration. If you don't want to configure it manually, the First-Time Setup below will create it for you. Here's an example:

```yaml
port: 44497
host: 0.0.0.0
argo_url: "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/chat/"
argo_stream_url: "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/streamchat/"
argo_embedding_url: "https://apps.inside.anl.gov/argoapi/api/v1/resource/embed/"
user: "your_username"  # set during first-time setup
verbose: true          # can be changed during setup
```
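Before launching, a config like the one above can be sanity-checked with a few lines of Python. This is a standalone sketch, not part of argo-proxy; the field names come from the sample config above, and the specific checks are illustrative assumptions.

```python
# Minimal sketch (not part of argo-proxy): sanity-check a config dict
# before launching, using the fields shown in the sample config above.

REQUIRED_KEYS = {"port", "user", "argo_url", "argo_stream_url", "argo_embedding_url"}

def validate_config(cfg: dict) -> list[str]:
    """Return a list of problems found; an empty list means the config looks sane."""
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - cfg.keys())]
    port = cfg.get("port")
    if port is not None and not (isinstance(port, int) and 0 < port < 65536):
        problems.append(f"invalid port: {port!r}")
    for key in ("argo_url", "argo_stream_url", "argo_embedding_url"):
        url = cfg.get(key)
        if url is not None and not str(url).startswith("https://"):
            problems.append(f"{key} should be an https URL")
    return problems

if __name__ == "__main__":
    import yaml  # pip install pyyaml
    with open("config.yaml") as f:
        print(validate_config(yaml.safe_load(f)))
```

In practice `argo-proxy --validate` (described below) does this for you, including connectivity checks; the sketch only illustrates the shape of the file.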
### Running the Application

To start the application:

```bash
argo-proxy [config_path]
```
- **Without arguments**: searches for `config.yaml` under:
  - the current directory
  - `~/.config/argoproxy/`
  - `~/.argoproxy/`

  The first one found is used.
- **With a path**: uses the specified config file if it exists; otherwise, falls back to the default search.

  ```bash
  argo-proxy /path/to/config.yaml
  ```
- **With the `--edit` flag**: opens the config file in the default editor for modification.
### First-Time Setup

When running without an existing config file:

- The script offers to create `config.yaml` from `config.sample.yaml`
- Automatically selects a random available port (can be overridden)
- Prompts for:
  - Your username (sets the `user` field)
  - Verbose mode preference (sets the `verbose` field)
- Validates connectivity to the configured URLs
- Shows the generated config in a formatted display for review before proceeding
Example session:

```console
$ argo-proxy
No valid configuration found.
Would you like to create it from config.sample.yaml? [Y/n]:
Creating new configuration...
Use port [52226]? [Y/n/<port>]:
Enter your username: your_username
Enable verbose mode? [Y/n]
Created new configuration at: /home/your_username/.config/argoproxy/config.yaml
Using port 52226...
Validating URL connectivity...
Current configuration:
--------------------------------------
{
    "host": "0.0.0.0",
    "port": 52226,
    "user": "your_username",
    "argo_url": "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/chat/",
    "argo_stream_url": "https://apps-dev.inside.anl.gov/argoapi/api/v1/resource/streamchat/",
    "argo_embedding_url": "https://apps.inside.anl.gov/argoapi/api/v1/resource/embed/",
    "verbose": true
}
--------------------------------------
# ... proxy server startup info ...
```
### Configuration Options Reference

| Option | Description | Default |
|---|---|---|
| `host` | Host address to bind the server to | `0.0.0.0` |
| `port` | Application port | randomly assigned available port |
| `argo_url` | Argo Chat API URL | dev URL (for now) |
| `argo_stream_url` | Argo Stream API URL | dev URL (for now) |
| `argo_embedding_url` | Argo Embedding API URL | prod URL |
| `user` | Your username | (set during setup) |
| `verbose` | Debug logging | `true` |
### `argo-proxy` CLI Options

```console
$ argo-proxy -h
usage: argo-proxy [-h] [--host HOST] [--port PORT] [--verbose | --quiet] [--edit]
                  [--validate] [--show] [--version]
                  [config]

Argo Proxy CLI

positional arguments:
  config                Path to the configuration file

options:
  -h, --help            show this help message and exit
  --host HOST, -H HOST  Host address to bind the server to
  --port PORT, -p PORT  Port number to bind the server to
  --verbose, -v         Enable verbose logging, override if `verbose` set False in config
  --quiet, -q           Disable verbose logging, override if `verbose` set True in config
  --edit, -e            Open the configuration file in the system's default editor for
                        editing
  --validate, -vv       Validate the configuration file and exit
  --show, -s            Show the current configuration during launch
  --version, -V         Show the version and exit.
```
### Management Utilities

The following options help manage the configuration file:

- `--edit`, `-e`: Open the configuration file in the system's default editor for editing.
  - If no config file is specified, it searches the default locations (`~/.config/argoproxy/`, `~/.argoproxy/`, or the current directory)
  - Tries common editors such as nano, vi, or vim (Unix-like systems) or notepad (Windows)
- `--validate`, `-vv`: Validate the configuration file and exit without starting the server.
  - Useful for checking config syntax and connectivity before deployment
- `--show`, `-s`: Show the current configuration during launch.
  - Displays the fully resolved configuration, including defaults
  - Can be combined with `--validate` to display the configuration without starting the server

```bash
# Example usage:
argo-proxy --edit             # edit config file
argo-proxy --validate --show  # validate and display config
argo-proxy --show             # show config at startup
```
## Usage

### Endpoints

#### OpenAI Compatible

These endpoints convert responses from the ARGO API to be compatible with OpenAI's format:

- `/v1/responses`: Responses API. Available from v2.7.0.
- `/v1/chat/completions`: Chat Completions API.
- `/v1/completions`: Legacy Completions API.
- `/v1/embeddings`: Embedding API.
- `/v1/models`: Lists available models in OpenAI-compatible format.
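Because these endpoints speak the OpenAI wire format, any OpenAI-style HTTP request works against the proxy. The following is a minimal, stdlib-only sketch; the port (44497, taken from the sample config above) and the model name are assumptions, so substitute your own.

```python
# Hedged sketch: build an OpenAI-style Chat Completions request aimed at a
# locally running argo-proxy. Port and model name are assumptions.
import json
from urllib import request

def build_chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style /v1/chat/completions request for the proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:44497", "argo:gpt-4o", "Hello!")
print(req.full_url)  # http://localhost:44497/v1/chat/completions
# To actually send it (requires a running proxy):
#   with request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

The examples section further down shows the same flow with `requests`, `httpx`, and the `openai` client.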
#### Not OpenAI Compatible

These endpoints interact directly with the ARGO API and do not convert responses to OpenAI's format:

- `/v1/chat`: Proxies requests to the ARGO API without conversion.
- `/v1/embed`: Proxies requests to the ARGO Embedding API without conversion.
#### Utility Endpoints

- `/health`: Health check endpoint. Returns `200 OK` if the server is running.
- `/version`: Returns the version of the ArgoProxy server and notifies you if a newer version is available. Available from v2.7.0.post1.
### Timeout Override

You can override the default timeout by adding a `timeout` parameter to your request. The parameter is optional for client requests; the proxy server keeps the connection open until the request finishes or the client disconnects.

For details on how to make such an override in different query flavors, see: Timeout Override Examples
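As a rough illustration, the override is just an extra field alongside an otherwise standard payload. Both the placement (JSON body) and the units (seconds) shown here are assumptions; defer to the Timeout Override Examples document for the authoritative forms.

```python
# Hedged sketch: a Chat Completions payload with a hypothetical `timeout`
# override. Field placement and units are assumptions -- see the Timeout
# Override Examples document for the exact supported forms.
import json

payload = {
    "model": "argo:gpt-4o",
    "messages": [{"role": "user", "content": "Summarize a long document..."}],
    "timeout": 600,  # assumed: ask the proxy to wait up to 600 seconds
}
body = json.dumps(payload).encode()  # POST this to /v1/chat/completions
```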
## Models

### Chat Models

| Original ARGO Model Name | Argo Proxy Name |
|---|---|
| `gpt35` | `argo:gpt-3.5-turbo` |
| `gpt35large` | `argo:gpt-3.5-turbo-16k` |
| `gpt4` | `argo:gpt-4` |
| `gpt4large` | `argo:gpt-4-32k` |
| `gpt4turbo` | `argo:gpt-4-turbo` |
| `gpt4o` | `argo:gpt-4o` |
| `gpt4olatest` | `argo:gpt-4o-latest` |
| `gpto1preview` | `argo:gpt-o1-preview`, `argo:o1-preview` |
| `gpto1mini` | `argo:gpt-o1-mini`, `argo:o1-mini` |
| `gpto3mini` | `argo:gpt-o3-mini`, `argo:o3-mini` |
| `gpto1` | `argo:gpt-o1`, `argo:o1` |
### Embedding Models

| Original ARGO Model Name | Argo Proxy Name |
|---|---|
| `ada002` | `argo:text-embedding-ada-002` |
| `v3small` | `argo:text-embedding-3-small` |
| `v3large` | `argo:text-embedding-3-large` |
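Using an embedding model name from the table above, an OpenAI-style payload for the proxy's `/v1/embeddings` endpoint looks like this. Stdlib-only sketch; the port (44497, from the sample config) is an assumption.

```python
# Hedged sketch: build (but do not send) an embeddings request for a locally
# running argo-proxy. Port is an assumption from the sample config.
import json
from urllib import request

payload = {
    "model": "argo:text-embedding-3-small",  # proxy name from the table above
    "input": ["argo-proxy converts Argo responses to OpenAI format."],
}
req = request.Request(
    "http://localhost:44497/v1/embeddings",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.full_url)  # http://localhost:44497/v1/embeddings
# Sending it (requires a running proxy) should return an OpenAI-style body,
# e.g. {"object": "list", "data": [{"embedding": [...], ...}], ...}
```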
## Examples

### Raw Requests

For examples of how to use raw request utilities (e.g., `httpx`, `requests`), refer to:

#### Direct Access to ARGO
- Direct Chat Example: argo_chat.py
- Direct Chat Stream Example: argo_chat_stream.py
- Direct Embedding Example: argo_embed.py
#### OpenAI Compatible Requests
- Chat Completions Example: chat_completions.py
- Chat Completions Stream Example: chat_completions_stream.py
- Legacy Completions Example: legacy_completions.py
- Legacy Completions Stream Example: legacy_completions_stream.py
- Responses Example: responses.py
- Responses Stream Example: responses_stream.py
- Embedding Example: embedding.py
- o1 Mini Chat Completions Example: o1_mini_chat_completions.py
### OpenAI Client

For examples demonstrating use of the OpenAI client (`openai.OpenAI`), refer to:
- Chat Completions Example: chat_completions.py
- Chat Completions Stream Example: chat_completions_stream.py
- Legacy Completions Example: legacy_completions.py
- Legacy Completions Stream Example: legacy_completions_stream.py
- Responses Example: responses.py
- Responses Stream Example: responses_stream.py
- Embedding Example: embedding.py
- O3 Mini Simple Chatbot Example: o3_mini_simple_chatbot.py
## Folder Structure

The following is an overview of the project's directory structure:

```console
$ tree -I "__pycache__|*.egg-info|dist|dev_scripts|config.yaml"
.
├── config.sample.yaml
├── examples
│   ├── openai_client
│   │   ├── chat_completions.py
│   │   ├── chat_completions_stream.py
│   │   ├── embedding.py
│   │   ├── legacy_completions.py
│   │   ├── legacy_completions_stream.py
│   │   ├── o3_mini_simple_chatbot.py
│   │   ├── responses.py
│   │   └── responses_stream.py
│   └── raw_requests
│       ├── argo_chat.py
│       ├── argo_chat_stream.py
│       ├── argo_embed.py
│       ├── chat_completions.py
│       ├── chat_completions_stream.py
│       ├── embedding.py
│       ├── legacy_completions.py
│       ├── legacy_completions_stream.py
│       ├── o1_mini_chat_completions.py
│       ├── responses.py
│       └── responses_stream.py
├── LICENSE
├── Makefile
├── pyproject.toml
├── README.md
├── run_app.sh
├── src
│   └── argoproxy
│       ├── app.py
│       ├── cli.py
│       ├── config.py
│       ├── constants.py
│       ├── endpoints
│       │   ├── chat.py
│       │   ├── completions.py
│       │   ├── embed.py
│       │   ├── extras.py
│       │   └── responses.py
│       ├── __init__.py
│       ├── py.typed
│       ├── types
│       │   ├── chat_completion.py
│       │   ├── completions.py
│       │   ├── embedding.py
│       │   ├── function_call.py
│       │   ├── __init__.py
│       │   └── responses.py
│       └── utils.py
└── timeout_examples.md

8 directories, 44 files
```
## Bug Reports and Contributions
This project is developed in my spare time. Bugs and issues may exist. If you encounter any or have suggestions for improvements, please open an issue or submit a pull request. Your contributions are highly appreciated!