# Ratatoskr
Turn any API into an MCP server. Query in English. Get results — even when the API can't.
Quick Start · How It Works · Providers · Reference · Development
Ratatoskr is a polyglot-LLM fork of agoda-com/api-agent — Agoda's universal API-to-MCP bridge. This fork adds first-class Anthropic and OpenAI-compatible (Ollama, LM Studio, vLLM) provider support alongside the original OpenAI backend. All credit for the core architecture goes to the Agoda engineering team.
Point at any GraphQL or REST API. Ask questions in natural language. The agent fetches data, stores it in DuckDB, and runs SQL post-processing. Rankings, filters, JOINs work even if the API doesn't support them.
## What Makes It Different

- **Zero config.** No custom MCP code per API. Point at a GraphQL endpoint or OpenAPI spec — schema introspected automatically.
- **SQL post-processing.** API returns 10,000 unsorted rows? Agent ranks top 10. No GROUP BY? Agent aggregates. Need JOINs across endpoints? Agent combines.
- **Safe by default.** Read-only. Mutations blocked unless explicitly allowed.
- **Recipe learning.** Successful queries become cached pipelines. Reuse instantly without LLM reasoning.
- **Polyglot LLM.** Run with OpenAI, Anthropic (Claude), or any OpenAI-compatible endpoint — same capabilities, your choice of model.
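The SQL post-processing idea above can be sketched in a few lines. The real agent uses DuckDB; this dependency-free illustration uses Python's stdlib `sqlite3` instead, and the table, column names, and rows are invented for the example:

```python
import sqlite3

# Pretend the target API returned these rows unsorted, with no ranking support.
rows = [
    ("Rick", "Human", 51),
    ("Birdperson", "Bird-Person", 12),
    ("Morty", "Human", 51),
    ("Squanchy", "Cat-Person", 5),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE characters (name TEXT, species TEXT, episodes INTEGER)")
con.executemany("INSERT INTO characters VALUES (?, ?, ?)", rows)

# The ranking the API never offered: top 2 characters by episode count.
top = con.execute(
    "SELECT name, episodes FROM characters ORDER BY episodes DESC, name LIMIT 2"
).fetchall()
print(top)  # [('Morty', 51), ('Rick', 51)]
```

The agent does the equivalent automatically: fetch once, load into a local table, then answer ranking/grouping/join questions with SQL rather than extra API calls.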
## Quick Start

**1. Run** (choose one):

```bash
# OpenAI (default)
OPENAI_API_KEY=your_key uv run api-agent

# Anthropic (Claude)
uv run api-agent --provider anthropic --api-key your_key

# Local model (Ollama, LM Studio, vLLM)
uv run api-agent --provider openai-compat --base-url http://localhost:11434/v1 --model llama3

# Or Docker (OpenAI)
docker build -t ratatoskr .
docker run -p 3000:3000 -e OPENAI_API_KEY=your_key ratatoskr
```
**2. Add to any MCP client:**

```json
{
  "mcpServers": {
    "rickandmorty": {
      "url": "http://localhost:3000/mcp",
      "headers": {
        "X-Target-URL": "https://rickandmortyapi.com/graphql",
        "X-API-Type": "graphql"
      }
    }
  }
}
```
**3. Ask questions:**
- "Show characters from Earth, only alive ones, group by species"
- "Top 10 characters by episode count"
- "Compare alive vs dead by species, only species with 10+ characters"
That's it. Agent introspects schema, generates queries, runs SQL post-processing.
## More Examples

**REST API (Petstore):**

```json
{
  "mcpServers": {
    "petstore": {
      "url": "http://localhost:3000/mcp",
      "headers": {
        "X-Target-URL": "https://petstore3.swagger.io/api/v3/openapi.json",
        "X-API-Type": "rest"
      }
    }
  }
}
```
**Your own API with auth:**

```json
{
  "mcpServers": {
    "myapi": {
      "url": "http://localhost:3000/mcp",
      "headers": {
        "X-Target-URL": "https://api.example.com/graphql",
        "X-API-Type": "graphql",
        "X-Target-Headers": "{\"Authorization\": \"Bearer YOUR_TOKEN\"}"
      }
    }
  }
}
```
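Because `X-Target-Headers` is a JSON string embedded inside JSON, hand-escaping the quotes is error-prone. A small sketch that generates the client config programmatically instead — the token value is a placeholder, and the config shape mirrors the example above:

```python
import json

auth_headers = {"Authorization": "Bearer YOUR_TOKEN"}  # placeholder token

config = {
    "mcpServers": {
        "myapi": {
            "url": "http://localhost:3000/mcp",
            "headers": {
                "X-Target-URL": "https://api.example.com/graphql",
                "X-API-Type": "graphql",
                # json.dumps produces the escaped nested-JSON string for us.
                "X-Target-Headers": json.dumps(auth_headers),
            },
        }
    }
}

print(json.dumps(config, indent=2))
```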
## How It Works

```mermaid
sequenceDiagram
    participant U as User
    participant M as MCP Server
    participant A as Agent
    participant G as Target API
    U->>M: Question + Headers
    M->>G: Schema introspection
    G-->>M: Schema
    M->>A: Schema + question
    A->>G: API call
    G-->>A: Data stored in DuckDB
    A->>A: SQL post-processing
    A-->>M: Summary
    M-->>U: {ok, data, queries[]}
```
### Architecture

```mermaid
flowchart TB
    subgraph Client["MCP Client"]
        H["Headers: X-Target-URL, X-API-Type"]
    end
    subgraph MCP["MCP Server (FastMCP)"]
        Q["{prefix}_query"]
        E["{prefix}_execute"]
        R["r_{recipe} (dynamic)"]
    end
    subgraph Agent["Agents (Polyglot LLM)"]
        GA["GraphQL Agent"]
        RA["REST Agent"]
    end
    subgraph Exec["Executors"]
        HTTP["HTTP Client"]
        Duck["DuckDB"]
    end
    Client -->|NL + headers| MCP
    Q -->|graphql| GA
    Q -->|rest| RA
    E --> HTTP
    R -->|"no LLM"| HTTP
    R --> Duck
    GA --> HTTP
    RA --> HTTP
    GA --> Duck
    RA --> Duck
    HTTP --> API[Target API]
```

**Stack:** FastMCP · OpenAI / Anthropic / OpenAI-compatible · DuckDB
## Recipe Learning

The agent learns reusable patterns from successful queries:

- **Executes** — API calls + SQL via LLM reasoning
- **Extracts** — LLM converts the trace into a parameterized template
- **Caches** — stores the recipe keyed by (API, schema hash)
- **Exposes** — recipe becomes an MCP tool (`r_{name}`) callable without the LLM
```mermaid
flowchart LR
    subgraph First["First Query via {prefix}_query"]
        Q1["'Top 5 users by age'"]
        A1["Agent reasons"]
        E1["API + SQL"]
        R1["Recipe extracted"]
    end
    subgraph Tools["MCP Tools"]
        T["r_get_top_users<br/>params: {limit}"]
    end
    subgraph Reuse["Direct Call"]
        Q2["r_get_top_users({limit: 10})"]
        X["Execute directly"]
    end
    Q1 --> A1 --> E1 --> R1 --> T
    Q2 --> T --> X
```

Recipes auto-expire on schema changes. Disable with `API_AGENT_ENABLE_RECIPES=false`.
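The caching and expiry behavior described above can be sketched as a small LRU keyed by `(api_url, schema_hash)`. This is an illustrative model, not Ratatoskr's actual internals; the class and method names are invented, and only the default size (64, matching `API_AGENT_RECIPE_CACHE_SIZE`) comes from the docs:

```python
from collections import OrderedDict


class RecipeCache:
    """Illustrative LRU recipe cache keyed by (api_url, schema_hash).

    A schema change produces a new hash, so stale recipes simply stop
    matching -- the same observable effect as auto-expiry.
    """

    def __init__(self, max_size: int = 64):  # mirrors API_AGENT_RECIPE_CACHE_SIZE
        self.max_size = max_size
        self._store: OrderedDict = OrderedDict()

    def put(self, api_url: str, schema_hash: str, recipe: dict) -> None:
        key = (api_url, schema_hash)
        self._store[key] = recipe
        self._store.move_to_end(key)  # mark as most recently used
        while len(self._store) > self.max_size:
            self._store.popitem(last=False)  # evict least recently used

    def get(self, api_url: str, schema_hash: str):
        key = (api_url, schema_hash)
        if key not in self._store:
            return None
        self._store.move_to_end(key)
        return self._store[key]


cache = RecipeCache(max_size=2)
cache.put("https://api.example.com", "hash-v1", {"name": "r_top_users"})
recipe = cache.get("https://api.example.com", "hash-v1")  # hit
stale = cache.get("https://api.example.com", "hash-v2")   # schema changed -> miss
```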
## Providers
Ratatoskr supports multiple LLM providers through a thin abstraction layer.
### OpenAI (default)

```bash
OPENAI_API_KEY=sk-... uv run api-agent
```

### Anthropic (Claude)

```bash
# Via CLI
uv run api-agent --provider anthropic --api-key sk-ant-...

# Via env vars
API_AGENT_PROVIDER=anthropic ANTHROPIC_API_KEY=sk-ant-... uv run api-agent

# Custom model
uv run api-agent --provider anthropic --model claude-opus-4-20250514
```

### Local Models (Ollama, LM Studio, vLLM)

```bash
# Ollama
uv run api-agent --provider openai-compat \
  --base-url http://localhost:11434/v1 \
  --model llama3

# LM Studio
uv run api-agent --provider openai-compat \
  --base-url http://localhost:1234/v1 \
  --model local-model

# vLLM
uv run api-agent --provider openai-compat \
  --base-url http://gpu-server:8000/v1 \
  --model mistral-7b
```

**Note:** Local models must support tool/function calling for full functionality. If an endpoint doesn't support tools, the agent will retry without them (graceful degradation).
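The graceful-degradation behavior can be sketched as a retry wrapper. Everything here is a hypothetical stand-in — `complete_with_fallback`, `call_llm`, and the error type are not the project's real API; they just show the try-with-tools-then-retry-without pattern:

```python
def complete_with_fallback(call_llm, messages, tools):
    """Try a tool-enabled request first; if the endpoint rejects tools,
    retry the same messages without them (graceful degradation)."""
    try:
        return call_llm(messages=messages, tools=tools)
    except NotImplementedError:  # stand-in for "endpoint lacks tool support"
        return call_llm(messages=messages, tools=None)


# A fake backend that refuses tool calls, to exercise the fallback path.
def fake_llm(messages, tools):
    if tools:
        raise NotImplementedError("tools unsupported")
    return {"role": "assistant", "content": "plain completion"}


result = complete_with_fallback(
    fake_llm,
    [{"role": "user", "content": "hi"}],
    tools=[{"name": "some_tool"}],
)
```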
## Reference

### Headers

| Header | Required | Description |
|---|---|---|
| `X-Target-URL` | Yes | GraphQL endpoint OR OpenAPI spec URL |
| `X-API-Type` | Yes | `graphql` or `rest` |
| `X-Target-Headers` | No | JSON auth headers, e.g. `{"Authorization": "Bearer xxx"}` |
| `X-API-Name` | No | Override tool name prefix (default: auto-generated) |
| `X-Base-URL` | No | Override base URL for REST API calls |
| `X-Allow-Unsafe-Paths` | No | JSON array of glob patterns for POST/PUT/DELETE/PATCH |
| `X-Poll-Paths` | No | JSON array of paths requiring polling (enables poll tool) |
| `X-Include-Result` | No | Include full uncapped `result` field in output |
### MCP Tools

Core tools (2 per API):

| Tool | Input | Output |
|---|---|---|
| `{prefix}_query` | Natural language question | `{ok, data, queries/api_calls}` |
| `{prefix}_execute` | GraphQL: `query`, `variables` / REST: `method`, `path`, `params` | `{ok, data}` |

Tool names are auto-generated from the URL (e.g., `example_query`). Override with `X-API-Name`.
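One plausible way to derive such a prefix from the target URL — this is an illustrative guess, not Ratatoskr's actual derivation, which `X-API-Name` overrides in any case:

```python
import re
from urllib.parse import urlparse


def derive_prefix(target_url: str) -> str:
    """Guess a tool-name prefix from the target URL's hostname.

    Illustrative only: takes the first hostname label and sanitizes it
    so it is safe inside tool names like {prefix}_query.
    """
    host = urlparse(target_url).hostname or "api"
    first_label = host.split(".")[0]
    return re.sub(r"[^a-z0-9_]", "_", first_label.lower())


print(derive_prefix("https://example.com/graphql") + "_query")  # example_query
```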
Recipe tools (dynamic, added as recipes are learned):

| Tool | Input | Output |
|---|---|---|
| `r_{recipe_slug}` | Flat recipe-specific params, `return_directly` (bool) | CSV or `{ok, data, executed_queries/calls}` |

Cached pipelines, no LLM reasoning. Appear after successful queries. Clients notified via `tools/list_changed`.
### CLI Arguments

| Argument | Description |
|---|---|
| `--provider` | LLM provider: `openai`, `anthropic`, or `openai-compat` |
| `--model` | Model name (default: provider-specific) |
| `--api-key` | API key (overrides env vars) |
| `--base-url` | Custom LLM endpoint (required for `openai-compat`) |
| `--port` | Server port (default: 3000) |
| `--host` | Server host (default: 0.0.0.0) |
| `--transport` | MCP transport: `http`, `streamable-http`, or `sse` |
| `--debug` | Enable debug logging |
CLI arguments override environment variables.
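That precedence rule (CLI flag, then environment variable, then provider default) can be sketched as follows. The resolver function is hypothetical; only the env var name and the default models come from the tables in this document:

```python
import os

# Provider defaults, per the configuration table below.
PROVIDER_DEFAULT_MODELS = {
    "openai": "gpt-4o",
    "anthropic": "claude-sonnet-4-20250514",
    "openai-compat": "gpt-4o",
}


def resolve_model(cli_model, provider):
    """CLI argument wins, then API_AGENT_MODEL_NAME, then the provider default."""
    return (
        cli_model
        or os.environ.get("API_AGENT_MODEL_NAME")
        or PROVIDER_DEFAULT_MODELS[provider]
    )


os.environ.pop("API_AGENT_MODEL_NAME", None)  # ensure the env var is unset
model = resolve_model(None, "anthropic")  # falls through to the provider default
```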
### Configuration (env vars)

| Variable | Required | Default | Description |
|---|---|---|---|
| `API_AGENT_PROVIDER` | No | `openai` | LLM provider (`openai`, `anthropic`, `openai-compat`) |
| `API_AGENT_API_KEY` | Yes | - | API key (also accepts `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`) |
| `API_AGENT_BASE_URL` | No* | - | Custom LLM endpoint (*required for `openai-compat`) |
| `API_AGENT_MODEL_NAME` | No | (provider default) | Model name |
| `API_AGENT_PORT` | No | 3000 | Server port |
| `API_AGENT_ENABLE_RECIPES` | No | true | Enable recipe learning & caching |
| `API_AGENT_RECIPE_CACHE_SIZE` | No | 64 | Max cached recipes (LRU eviction) |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | No | - | OpenTelemetry tracing endpoint |

Provider defaults:

| Provider | Default model | API key env var |
|---|---|---|
| `openai` | `gpt-4o` | `OPENAI_API_KEY` |
| `anthropic` | `claude-sonnet-4-20250514` | `ANTHROPIC_API_KEY` |
| `openai-compat` | `gpt-4o` | (optional) |
## Roadmap

Planned improvements (contributions welcome):

- **Streaming responses** — stream agent reasoning and partial results to MCP clients
- **Mutation support** — controlled write operations with confirmation flows
- **Schema caching** — cache introspected schemas to reduce startup latency
- **Multi-API joins** — query across multiple APIs in a single request
- **Recipe sharing** — export/import learned recipes between instances
- **WebSocket subscriptions** — support GraphQL subscriptions for real-time data
- **Plugin system** — custom pre/post-processing hooks for API responses
## Development

```bash
git clone https://github.com/innago-property-management/ratatoskr.git
cd ratatoskr
uv sync --group dev

uv run pytest tests/ -v       # Tests (511 passing)
uv run ruff check api_agent/  # Lint
uv run ty check               # Type check
```
### Observability

Set `OTEL_EXPORTER_OTLP_ENDPOINT` to enable OpenTelemetry tracing. Works with Jaeger, Zipkin, Grafana Tempo, and Arize Phoenix.
## Origin & Attribution

Ratatoskr is a fork of api-agent by Agoda, licensed under the MIT License.

The core architecture — FastMCP server, dynamic tool naming, agent orchestration, DuckDB post-processing, and recipe learning — is entirely Agoda's work. Ratatoskr extends it with:

- **Polyglot LLM support** — Anthropic, OpenAI, and OpenAI-compatible providers via a pluggable `LLMProvider` abstraction
- **Expanded test coverage** — 511 tests covering orchestration, safety boundaries, configuration contracts, and provider SDK surfaces
- **GraphQL partial success fix** — returns both `data` and `errors` when both are present, per the GraphQL specification

The name *Ratatoskr* comes from the Norse squirrel who runs up and down Yggdrasil carrying messages between realms — a fitting metaphor for a universal API-to-LLM bridge.

**Upstream:** agoda-com/api-agent · Blog post