# chatjimmy-proxy

OpenAI-compatible HTTP proxy for chatjimmy.ai. Point any OpenAI SDK or tool at
it and use model `jimmy`.
## Quick start

1. Clone and install:

   ```bash
   git clone <repo>
   cd chatjimmy-proxy
   uv sync
   uv run playwright install chromium
   ```

2. Configure:

   ```bash
   cp .env.example .env
   # edit PROXY_API_KEY (leave blank to disable auth)
   ```
   Note: if `PROXY_API_KEY` is already set in your shell environment (for example, some systems default it to your username), the proxy will require that exact value in the `Authorization` header. Use `export PROXY_API_KEY=` to clear it, or choose a different secret.

3. Run discovery once:
   ```bash
   uv run chatjimmy-discover
   ```

4. Start the proxy (default port 8000; change with the `PORT` env var):

   ```bash
   uv run chatjimmy-proxy
   # or explicitly:
   uv run uvicorn chatjimmy_proxy.main:app --host 0.0.0.0 --port ${PORT:-8000}
   ```
   If you see "address already in use", set `PORT` to a free port (e.g. 8001) or kill the process currently listening on the port.

## Usage
```bash
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $PROXY_API_KEY" \
  -d '{"model":"jimmy","messages":[{"role":"user","content":"Hi"}]}'
```
Streaming: add `--no-buffer` and `"stream": true`.

Python example:
```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="key")
print(
    client.chat.completions.create(
        model="jimmy",
        messages=[{"role": "user", "content": "Hi"}],
    ).choices[0].message.content
)
```
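With `"stream": true`, an OpenAI-compatible endpoint returns Server-Sent Events whose `data:` payloads carry incremental deltas. If you are consuming the stream without an SDK, assembling the text looks roughly like this stdlib-only sketch (the helper name is ours; the wire format is the standard chat-completions chunk shape):

```python
import json


def extract_deltas(sse_body: str) -> str:
    """Collect assistant text from the `data:` lines of an SSE chat stream."""
    parts = []
    for line in sse_body.splitlines():
        if not line.startswith("data: "):
            continue  # skip comments and blank keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":  # sentinel ending the stream
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))  # first chunk may carry only a role
    return "".join(parts)
```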
## Editors and agents
Any tool that lets you supply a custom OpenAI‑compatible provider should work. You need three things:
- **Base URL** – the root of the OpenAI API, not a specific endpoint. Use `http://<host>:<port>/v1` (omit `/chat/completions`); most clients append the path themselves. If you include `/chat/completions` twice, you'll see 404s like `/v1/chat/completions/chat/completions`.
- **API key** – the secret from `.env` (or any string if auth is off).
- **Model** – `jimmy`.
Correct Roo Code configuration example:
```json
{
  "provider": "OpenAI Compatible",
  "baseUrl": "http://localhost:8000/v1",
  "apiKey": "<your-proxy-key>",
  "model": "jimmy"
}
```
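The base-URL rule above can be captured in a small normalizer. This is a sketch for illustration, not part of the proxy; it shows why `/v1` with nothing after it is the correct value:

```python
def normalize_base_url(url: str) -> str:
    """Strip a trailing /chat/completions so clients can append the path themselves."""
    url = url.rstrip("/")
    suffix = "/chat/completions"
    if url.endswith(suffix):
        url = url[: -len(suffix)]
    return url
```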
If you accidentally set the base URL to include `/chat/completions`, the agent will produce 404 errors when it tries to call `/v1/chat/completions/chat/completions`.

## Development
```bash
uv run pytest tests/ -v
uv run ruff check src/
uv run ruff format src/
```

## Packaging & publishing
Build with `uv run hatch build`. Releases are made by tagging `vX.Y.Z`; GitHub Actions tests and publishes to PyPI using `PYPI_API_TOKEN`.

## Troubleshooting
- **Port already in use** – if the proxy fails to start with `address already in use`:

  ```bash
  # locate the offending PID
  sudo lsof -i :8000 -t   # substitute whatever port you were using
  # kill it (or choose a different port)
  sudo kill <pid>
  # or directly:
  sudo kill -9 $(sudo lsof -i :8000 -t)
  ```

  Alternatively, set `PORT` to a free port before launching:

  ```bash
  PORT=8001 uv run chatjimmy-proxy
  ```
- **Discovery failures** – run with `HEADLESS=false` or switch to `mode: browser_relay` in the blueprint.
- **401/403** – clear `.jimmy_blueprint.json` / `.jimmy_state.json` and re-run discovery; ensure `PROXY_API_KEY` is correct.
- **Slow first response** – discovery runs on startup; subsequent requests are fast under HTTP-replay mode.
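If you'd rather let the OS pick a free port than hunt for one with `lsof`, a small stdlib sketch (our own helper, not part of the proxy) does the trick by binding to port 0:

```python
import socket


def find_free_port() -> int:
    """Ask the OS for an unused TCP port by binding to port 0."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]
```

You could then launch with `PORT=$(python -c ...) uv run chatjimmy-proxy`, though note the port is only guaranteed free at the moment it is checked.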
## License

MIT – see LICENSE.