api-to-tools

Universal library that converts any API (OpenAPI, WSDL/SOAP, GraphQL, gRPC, AsyncAPI) into LLM-callable tool definitions.

Give it a website URL (with or without credentials) and it returns a list of Tools that can be handed directly to Claude, OpenAI, or an MCP server — no manual tool wiring required.



What it does

from api_to_tools import discover, AuthConfig

# Public OpenAPI / Swagger site
tools = discover("https://petstore.swagger.io")
# → 20 tools

# Private admin panel (login → auto-discover backend Swagger)
tools = discover(
    "https://admin.example.com/",
    auth=AuthConfig(type="cookie", username="admin", password="admin"),
)
# → 1090 tools (includes path parameters like {stdCtgNo}, body DTOs, enum values)

# Korean legacy enterprise (Nexacro/SSV, no Swagger at all)
tools = discover(
    "https://pro.example.com/",
    auth=AuthConfig(type="cookie", username="user", password="pass"),
)
# → Playwright crawler + SSV parser

One function call, one URL, one account — you get a complete tool catalog.
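Each discovered Tool is ultimately rendered into a provider-specific definition; for Anthropic, that is a JSON object with `name`, `description`, and `input_schema` fields. A hand-written example of that target shape (illustrative only, not actual library output; the endpoint is borrowed from the Petstore example above):

```python
# Hand-written example of the Anthropic tool-definition shape that
# to_anthropic_tools() targets. Illustrative, not produced by the library.
example_tool = {
    "name": "getPetById",
    "description": "GET /pet/{petId}: fetch a pet by its ID",
    "input_schema": {
        "type": "object",
        "properties": {
            "petId": {"type": "integer", "description": "ID of pet to return"},
        },
        "required": ["petId"],
    },
}
```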


Installation

pip install api-to-tools

# Optional: browser-based crawling for sites without a Swagger spec
pip install 'api-to-tools[crawler]'
python -m playwright install chromium

Requires Python 3.10+.


Supported sources

| Source | Notes |
|---|---|
| OpenAPI 3.0 / 3.1 | Full body DTO, enum, response schema extraction |
| Swagger 2.0 (legacy) | `parameters[].in=body`, `responses.200.schema` |
| WSDL / SOAP | zeep-based, input/output schemas |
| GraphQL | Introspection, selection-set auto-build |
| gRPC / Protobuf | `.proto` file parsing, streaming detection |
| AsyncAPI 3.0 | WebSocket / MQTT operations |
| Authenticated Swagger | Login → guess backend → Bearer probe |
| Nexacro / SSV | Korean enterprise legacy (Lotte, financial sector, etc.) |
| JS bundle scanning | Static analysis when no spec exists |
| Playwright crawler | Dynamic SPA discovery with safe mode |
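Distinguishing these sources starts with sniffing the fetched document itself. A standalone sketch of content-based spec-type detection (a conceptual illustration under my own assumptions, not the library's actual detector logic):

```python
import json

# Sketch: guess a spec's type from its raw content.
# Illustrative only; the library's detector internals are not shown here.
def sniff_spec_type(text: str) -> str:
    text = text.lstrip()
    if text.startswith("<"):
        # XML documents: WSDL declares a wsdl namespace near the top
        if "wsdl" in text[:500].lower():
            return "wsdl"
        return "unknown-xml"
    try:
        doc = json.loads(text)
    except ValueError:
        return "unknown"
    if "openapi" in doc:                 # OpenAPI 3.x carries a version string
        return f"openapi-{doc['openapi']}"
    if "swagger" in doc:                 # legacy Swagger 2.0
        return "swagger-2.0"
    if "asyncapi" in doc:
        return "asyncapi"
    return "unknown"

print(sniff_spec_type('{"openapi": "3.1.0"}'))  # → openapi-3.1.0
```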

How discovery works

discover() tries sources in priority order and stops at the first one that works:

1. Direct spec URL (OpenAPI, WSDL, GraphQL)
2. Nexacro platform detection  → Nexacro crawler + SSV parser
3. Well-known paths probe     → /openapi.json, /swagger.json, /api-docs, ...
4. Authenticated Swagger      → login → guess backend → Bearer probe
5. JS bundle static scan      (opt-in: scan_js=True)
6. Playwright dynamic crawl   (opt-in: crawl=True)

Parallel probing and path priority mean most public APIs are discovered in under 2 seconds.
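The first-match strategy above can be sketched as a plain loop over probe functions (a conceptual illustration; the probe functions here are stand-ins, not library APIs):

```python
# Conceptual sketch of first-match discovery: try each source in
# priority order and return as soon as one yields tools.
# The probe functions below are illustrative stand-ins.

def probe_direct_spec(url):       # 1. direct OpenAPI / WSDL / GraphQL spec URL
    return []

def probe_well_known(url):        # 3. /openapi.json, /swagger.json, ...
    return ["getPet", "listPets"]

def probe_js_bundles(url):        # 5. opt-in static JS scan
    return []

def discover_sketch(url, scan_js=False):
    probes = [probe_direct_spec, probe_well_known]
    if scan_js:
        probes.append(probe_js_bundles)
    for probe in probes:
        tools = probe(url)
        if tools:                 # stop at the first source that works
            return tools
    return []

print(discover_sketch("https://petstore.example.com"))  # → ['getPet', 'listPets']
```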


CLI

# Summarize an API
api-to-tools info https://admin.example.com \
  --login-user admin --login-pass admin

# List tools filtered by tag
api-to-tools list https://admin.example.com \
  --login-user admin --login-pass admin \
  --tag "회원 정보 관리"   # Korean: "member information management"

# Export tool definitions
api-to-tools export https://admin.example.com \
  --login-user admin --login-pass admin \
  --format anthropic > tools.json

# Start an MCP server that exposes discovered APIs
api-to-tools serve https://admin.example.com \
  --login-user admin --login-pass admin \
  --name my-admin-api

Authentication options (any subcommand)

--bearer TOKEN            # Bearer token
--basic USER:PASS         # HTTP Basic
--api-key NAME=VALUE      # API key (header/query)
--cookie NAME=VALUE       # Direct cookie (repeatable)
--header "Name: Value"    # Custom header (repeatable)
--login URL               # Form login URL
--login-user USERNAME     # Login username (shortcut for cookie login)
--login-pass PASSWORD     # Login password

Discovery modes

--scan-js       # Static analysis of JavaScript bundles
--crawl         # Playwright browser crawl
--backend auto  # auto | system | playwright | lightpanda
--no-safe-mode  # DANGEROUS: allows destructive requests to reach the server

Python API

Basic usage

from api_to_tools import discover, execute, AuthConfig

tools = discover("https://date.nager.at/openapi/v3.json")
print(f"Found {len(tools)} tools")

# Execute a tool directly
tool = next(t for t in tools if "PublicHolidays" in t.name)
result = execute(tool, {"year": "2026", "countryCode": "KR"})
print(result.data)  # → list of 15 holidays

Authentication

# Basic Auth
AuthConfig(type="basic", username="admin", password="secret")

# Bearer token
AuthConfig(type="bearer", token="eyJ...")

# API key (header or query)
AuthConfig(type="api_key", key="X-API-Key", value="abc", location="header")

# Form login → session cookie
AuthConfig(type="cookie", username="user", password="pass")

# OAuth2 client credentials
AuthConfig(
    type="oauth2_client",
    token_url="https://auth.example.com/token",
    client_id="id",
    client_secret="secret",
)

# Custom headers
AuthConfig(type="custom", headers={"Authorization": "Custom xyz"})
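Conceptually, each of these configs maps to HTTP headers or cookies on outgoing requests. A standalone sketch of that mapping, using plain dicts in place of AuthConfig (illustrative only, not the library's actual implementation):

```python
import base64

# Sketch: map auth configs (plain dicts standing in for AuthConfig)
# to the HTTP headers they typically produce. Illustrative only.
def auth_headers(cfg: dict) -> dict:
    t = cfg["type"]
    if t == "bearer":
        return {"Authorization": f"Bearer {cfg['token']}"}
    if t == "basic":
        cred = f"{cfg['username']}:{cfg['password']}".encode()
        return {"Authorization": "Basic " + base64.b64encode(cred).decode()}
    if t == "api_key" and cfg.get("location", "header") == "header":
        return {cfg["key"]: cfg["value"]}
    if t == "custom":
        return dict(cfg["headers"])
    return {}

print(auth_headers({"type": "bearer", "token": "eyJ..."}))
# → {'Authorization': 'Bearer eyJ...'}
```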

Filters

tools = discover(
    url,
    auth=auth,
    tags=["users", "orders"],             # only specific tags
    methods=["GET"],                      # only GET
    path_filter=r"/api/v2/.*",            # regex on endpoint
    base_url="https://prod.example.com",  # override base URL
)

LLM integration

# Claude / Anthropic
from api_to_tools import to_anthropic_tools
import anthropic

client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-sonnet-4-5",
    tools=to_anthropic_tools(tools),
    messages=[{"role": "user", "content": "Find orders from last week"}],
)

# OpenAI function calling
from api_to_tools import to_function_calling
openai_tools = to_function_calling(tools)

# MCP server (stdio)
from api_to_tools.adapters import create_mcp_server
server = create_mcp_server(tools, name="my-api")
server.run(transport="stdio")

Utilities

from api_to_tools import summarize, group_by_tag, search_tools

summary = summarize(tools)
# {"total": 1090, "by_method": {...}, "by_tag": {...}, "by_protocol": {...}}

groups = group_by_tag(tools)
order_tools = search_tools(tools, "order")

Safe mode

When crawling a live production site, safe_mode=True (the default) intercepts mutation requests (POST/PUT/DELETE/PATCH) after login and fakes a success response. The request is still captured for discovery but never reaches the server, so endpoints like deleteUser, save, and send can't cause damage.

discover(url, auth=auth, crawl=True, safe_mode=True)

A smart heuristic allows read-style POSTs (getUserList, searchItems, auth endpoints) to pass through, since they're common in RPC-style APIs.
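A minimal sketch of such a heuristic (the method set and keyword list here are my own assumptions, not the library's actual rules):

```python
import re

# Illustrative read-style POST heuristic: block mutating methods unless
# the endpoint's last path segment looks like a read or auth operation.
# The keyword list is an assumption, not the library's actual list.
READ_HINTS = re.compile(r"(get|list|search|find|query|login|auth)", re.IGNORECASE)
MUTATING = {"POST", "PUT", "DELETE", "PATCH"}

def should_block(method: str, path: str) -> bool:
    if method.upper() not in MUTATING:
        return False                       # GET/HEAD etc. always pass through
    last = path.rstrip("/").rsplit("/", 1)[-1]
    return not READ_HINTS.search(last)     # allow read-style POSTs

print(should_block("POST", "/api/deleteUser"))   # → True  (blocked)
print(should_block("POST", "/api/getUserList"))  # → False (allowed)
```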


Architecture

api_to_tools/
├── core.py              # discover, to_tools, execute
├── types.py             # Tool, ToolParameter, AuthConfig, …
├── constants.py         # Timeouts, well-known paths, keywords
├── auth.py              # Auth config → HTTP headers/cookies
│
├── detector/
│   ├── __init__.py            # Spec type detection, parallel probing
│   └── swagger_discovery.py   # Authenticated backend Swagger hunting
│
├── parsers/
│   ├── openapi.py       # OpenAPI 3.x + Swagger 2.0
│   ├── wsdl.py          # WSDL/SOAP via zeep
│   ├── graphql.py       # GraphQL introspection
│   ├── grpc.py          # .proto parsing
│   ├── ssv.py           # Nexacro SSV format
│   ├── nexacro.py       # Nexacro-specific crawler
│   ├── crawler.py       # Generic Playwright crawler
│   ├── jsbundle.py      # Static JS bundle scanner
│   ├── _param_builder.py  # Shared ToolParameter helpers
│   └── _browser_utils.py  # Shared Playwright helpers
│
├── executors/
│   ├── rest.py          # REST + Nexacro SSV execution
│   ├── soap.py          # SOAP calls via zeep
│   └── graphql.py       # GraphQL query execution
│
└── adapters/
    ├── formats.py       # OpenAI / Anthropic tool format
    └── mcp_adapter.py   # MCP server generation

Development

git clone https://github.com/SonAIengine/api-to-tools.git
cd api-to-tools
pip install -e '.[dev]'

pytest              # 89 unit tests

Debug logging

from api_to_tools import enable_debug_logging
enable_debug_logging()

License

MIT © SonAIengine
