# FreeRide

One free AI endpoint. Five providers behind it. Your agents don't need to know.

A local OpenAI-compatible gateway that routes across free-tier providers (OpenRouter, Groq, NVIDIA NIM, Cloudflare Workers AI, HuggingFace) with automatic failover.
```sh
$ curl -sSL https://free-ride.xyz/install.sh | sh
$ export OPENROUTER_API_KEY=sk-or-v1-...
$ freeride serve
freeride gateway listening on http://127.0.0.1:11343
providers: openrouter   # add more by exporting their keys
```

Point any OpenAI-compatible agent at:

```sh
OPENAI_API_BASE=http://127.0.0.1:11343/v1
OPENAI_API_KEY=any
```
That's it. Aider, Continue, OpenClaw, Hermes, the OpenAI Python SDK — anything that speaks OpenAI now speaks every free tier you have a key for.
## Demo

```
┌─ your agent ─────────┐ ┌─ freeride (localhost) ─┐ ┌─ providers ─┐
│ │ POST │ │ │ │
│ chat.completions │────────▶│ pick provider │────────▶│ OpenRouter │ 429
│ .create(...) │ │ pick key (not cooling)│ retry │ ↓ │
│ │ │ forward request │────────▶│ Groq │ ✓
│ ◀───────────────────│ 200 │ ◀─────────────────────│ │ │
│ │ │ │ │ NIM, CF, │
│ │ │ X-FreeRide-Provider: │ │ HF — only │
│ │ │ groq │ │ if needed │
└──────────────────────┘ └────────────────────────┘ └─────────────┘
```
When OpenRouter rate-limits you, the next request goes to Groq. When Groq's daily token cap hits, the next goes to HuggingFace. Your agent never sees a 429.
## Why this exists
You can already get a free tier from OpenRouter. And NVIDIA. And Groq. And Cloudflare Workers AI. And HuggingFace. They all have different limits, different free-detection rules, different ways of saying "you're done for today."
So you sign up for all of them and now you've got five API keys, five SDKs, and an agent that only knows about one. FreeRide is the small thing that sits between them and pretends to be one OpenAI endpoint.
- Local-first. The gateway runs on your machine. Prompts and completions never touch a FreeRide server.
- BYO keys. Bring your own free-tier keys. FreeRide doesn't issue any.
- Free-only. No paid fallback. No upsell. If every provider is exhausted, the request fails — better that than a surprise bill.
## Install

```sh
curl -sSL https://free-ride.xyz/install.sh | sh
```

The installer bootstraps uv if missing, then runs `uv tool install freeride-gateway`. The binary lands at `~/.local/bin/freeride`. Same shape as the bun.sh and astral.sh installers.
Or install manually:

```sh
# uv (what the installer does)
uv tool install --prerelease=allow freeride-gateway

# pipx
pipx install --pip-args=--pre freeride-gateway

# pip + venv (activates the venv for this shell only; re-activate per shell)
python3 -m venv .venv && source .venv/bin/activate
pip install --pre freeride-gateway

# from source
git clone https://github.com/Shaivpidadi/FreeRideV3 && cd FreeRideV3
pip install -e .
```
PyPI distribution: `freeride-gateway`. CLI: `freeride`. Python ≥ 3.10.
## Get keys (any one is enough; more = better failover)
| Provider | Where | Env var |
|---|---|---|
| OpenRouter | https://openrouter.ai/keys | OPENROUTER_API_KEY |
| Groq | https://console.groq.com/keys | GROQ_API_KEY |
| NVIDIA NIM | https://build.nvidia.com | NVIDIA_API_KEY |
| Cloudflare Workers AI | https://dash.cloudflare.com/profile/api-tokens | CLOUDFLARE_API_TOKEN + CLOUDFLARE_ACCOUNT_ID |
| HuggingFace | https://huggingface.co/settings/tokens | HF_TOKEN |
Set whichever you have, then run `freeride serve`. The gateway picks them up and rotates between them.
## Wire your agent

The fastest way is a binder:

```sh
freeride bind aider      # writes ~/.aider.conf.yml
freeride bind continue   # writes ~/.continue/config.yaml
freeride bind hermes     # writes ~/.hermes/config.yaml
freeride bind openclaw   # writes ~/.openclaw/openclaw.json
```

Or set the OpenAI vars yourself:

```sh
export OPENAI_API_BASE=http://localhost:11343/v1
export OPENAI_API_KEY=any
```
Anything OpenAI-shaped works. Tested with the openai-python SDK, Aider, Continue, Hermes, OpenClaw.
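For a sense of the wire format, the same call can be built with nothing but the Python standard library (the model ID below is a placeholder; substitute one reported by `freeride list`):

```python
import json
import urllib.request

# Build an OpenAI-style chat.completions request against the local gateway.
# The API key can be any non-empty string: FreeRide holds the real provider keys.
req = urllib.request.Request(
    "http://127.0.0.1:11343/v1/chat/completions",
    data=json.dumps({
        "model": "llama-3.1-8b-instruct",  # placeholder model ID
        "messages": [{"role": "user", "content": "hello"}],
    }).encode(),
    headers={
        "Authorization": "Bearer any",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) sends it once `freeride serve` is running; the
# response carries an X-FreeRide-Provider header naming the upstream that answered.
```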
## Multi-key rotation

Got several free keys for the same provider? Pass them as a JSON array:

```sh
export OPENROUTER_API_KEY='["sk-or-v1-key1","sk-or-v1-key2","sk-or-v1-key3"]'
```
When key 1 hits 429 it goes on cooldown for 120s; key 2 takes the next request. Cooldowns persist across restarts (~/.freeride/cooldown.json).
## How failover works

Per request, FreeRide walks (provider, key) pairs in order:

- `RATE_LIMIT` or `AUTH` → mark this key cooling, try the next key.
- `MODEL_NOT_FOUND` → skip this provider, try the next provider.
- Anything 5xx-ish → try the next pair.
- First successful response → ship it; stamp the `X-FreeRide-Provider` header (or a `_freeride_provider` field on JSON) so you can tell who actually served it.
Streaming uses buffer-first-chunk failover: hold the first SSE event until upstream confirms the stream is real. If it fails before the first chunk, retry. After the first chunk has shipped, mid-stream errors propagate (rare; documented).
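The non-streaming walk can be sketched as follows. The error taxonomy mirrors the list above, but the function and class names are illustrative, not FreeRide's actual internals:

```python
from enum import Enum, auto

class Err(Enum):
    RATE_LIMIT = auto()
    AUTH = auto()
    MODEL_NOT_FOUND = auto()
    SERVER = auto()  # "anything 5xx-ish"

def route(pairs, send, cool):
    """Walk (provider, key) pairs in order until one succeeds."""
    skip = set()
    for provider, key in pairs:
        if provider in skip:
            continue  # MODEL_NOT_FOUND already ruled this provider out
        result = send(provider, key)
        if isinstance(result, Err):
            if result in (Err.RATE_LIMIT, Err.AUTH):
                cool(key)            # this key cools; another key may still work
            elif result is Err.MODEL_NOT_FOUND:
                skip.add(provider)   # no key on this provider will help
            continue                 # SERVER: just try the next pair
        return provider, result      # first success ships
    raise RuntimeError("every free provider is exhausted")

# Simulate the demo above: OpenRouter 429s, Groq answers.
responses = {("openrouter", "k1"): Err.RATE_LIMIT, ("groq", "g1"): {"ok": True}}
cooled = []
who, resp = route(
    [("openrouter", "k1"), ("groq", "g1")],
    send=lambda p, k: responses[(p, k)],
    cool=cooled.append,
)
print(who)  # groq
```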
## Telemetry

On by default. Hourly POST to https://telemetry.free-ride.xyz/v1/beacon:

```json
{
  "installation_id": "random-uuid-v4",
  "version": "0.3.0",
  "os": "darwin",
  "tokens_served": 412034,
  "request_count": 187,
  "providers_active": ["openrouter", "groq"],
  "uptime_hours": 8
}
```

Prompts, completions, model IDs, API keys, hostnames, IPs — never sent. The Worker doesn't log `cf-connecting-ip`. The first time you run any `freeride` command, a banner prints the exact payload.

```sh
freeride telemetry off   # turn it off
freeride telemetry       # show what would be sent
```
## Commands

```
freeride serve                 start the gateway
freeride bind <agent>          write gateway URL into agent config
freeride telemetry [on|off]    manage telemetry
freeride list                  list available free models
freeride status                show OpenClaw config + cache age (v2)
freeride auto                  auto-configure OpenClaw (v2)
freeride rotate                swap primary if it fails (v2)
freeride-watcher               background daemon that rotates on failure
```
The v2 commands keep working for existing OpenClaw users.
## Providers
| Provider | Status | Notes |
|---|---|---|
| OpenRouter | shipped | full surface — chat, streaming, tools, vision, structured outputs |
| NVIDIA NIM | shipped | curated free-model allowlist; NVIDIA_NIM_FREE_MODELS_OVERRIDE to expand |
| Groq | shipped | hardcoded allowlist (Llama 3.x, Gemma 2, Mixtral, DeepSeek-R1-distill); GROQ_FREE_MODELS_OVERRIDE to expand |
| Cloudflare Workers AI | shipped | curated allowlist of cheap-per-neuron chat models; needs CLOUDFLARE_ACCOUNT_ID |
| HuggingFace Inference | shipped | full HF router catalog; budget governs access ($0.10/mo Free, $2/mo PRO) |
Adding a sixth: implement `freeride.core.provider.Provider` (`api_version=1`) in `freeride/providers/<name>.py`, register it in the conformance suite, done. See CONTRIBUTING.md.
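A sketch of what a sixth provider might look like. The real base class lives at `freeride.core.provider.Provider`; the stand-in base and method names below are assumptions for illustration, not the actual interface:

```python
# Illustrative only: this local ABC stands in for freeride.core.provider.Provider
# (api_version=1). The method names here are guesses at the kind of surface a
# provider fills in, not the real API; consult CONTRIBUTING.md for the contract.
from abc import ABC, abstractmethod

class Provider(ABC):
    api_version = 1

    @abstractmethod
    def base_url(self) -> str: ...

    @abstractmethod
    def auth_headers(self, key: str) -> dict[str, str]: ...

    @abstractmethod
    def free_models(self) -> list[str]: ...

class ExampleProvider(Provider):
    """A hypothetical sixth provider with an OpenAI-shaped API."""
    def base_url(self) -> str:
        return "https://api.example.com/v1"

    def auth_headers(self, key: str) -> dict[str, str]:
        return {"Authorization": f"Bearer {key}"}

    def free_models(self) -> list[str]:
        return ["example/model-small"]

p = ExampleProvider()
print(p.auth_headers("tok")["Authorization"])  # Bearer tok
```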
## Agents

| Agent | freeride bind | Hot reload |
|---|---|---|
| OpenClaw | yes | needs restart |
| Aider | yes (--scope home/cwd/git) | needs restart |
| Continue | yes | yes |
| Hermes (NousResearch/hermes-agent) | yes | needs restart |
Or anything else: `OPENAI_API_BASE=http://localhost:11343/v1` + `OPENAI_API_KEY=any`.
## Claude Code skill

If you use Claude Code, install the FreeRide skill so Claude knows how to detect, wire, and troubleshoot the gateway:

```
/plugin install https://github.com/Shaivpidadi/FreeRideV3
```

After install, Claude auto-invokes the skill when you mention FreeRide, have it running on localhost:11343, or ask about routing across free-tier providers. See skills/README.md for manual-install instructions.
## Docs

- `docs/providers/SURVEY.md` — Provider Protocol fit per provider (auth shape, free-tier semantics, error mapping)
- `docs/providers/nvidia_nim.md` — NVIDIA NIM specifics (free-model allowlist, 403=AUTH quirk)
- `docs/agent-binders.md` — per-agent bind reference (config locations, hot-reload behavior, edge cases)
- `docs/hermes.md` — Hermes identification + bind plan
- `CONTRIBUTING.md` — adding a provider or binder
## License

MIT.