# langchain-x402-discovery

A LangChain tool for x402 service discovery: let your agent find and call any paid API endpoint at runtime without hardcoding URLs or API keys.

When your LangChain agent needs web search, image generation, data enrichment, or any other external capability, it calls `x402_discover` to find the best available service in the live x402 discovery catalog, then calls that endpoint directly using the returned code snippet.
## Installation

```shell
pip install langchain-x402-discovery
```
## Quick Start

```python
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_x402_discovery import get_x402_discovery_tool

llm = ChatOpenAI(model="gpt-4o")
tools = [get_x402_discovery_tool()]

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant. Use x402_discover to find paid API services when needed."),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),
])

agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

result = executor.invoke({"input": "Find a web search API and tell me the current price per call"})
print(result["output"])
```
## Full Working Example

```python
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_x402_discovery import get_x402_discovery_tool, X402DiscoveryTool

# Option 1: Default tool pointing at the public catalog
tool = get_x402_discovery_tool()

# Option 2: Custom discovery API URL (e.g. a private catalog)
tool = X402DiscoveryTool(
    discovery_api_url="https://x402-discovery-api.onrender.com"
)

# Add to any existing LangChain agent alongside your other tools
llm = ChatOpenAI(model="gpt-4o", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", """You are an autonomous research assistant.
When you need to call any external API — web search, data enrichment, image analysis,
translation, or any other capability — first call x402_discover with a short description
of what you need. It will return the endpoint URL, price, and a Python code snippet.
Use that snippet to make the actual API call.
Always prefer the x402 catalog over hardcoded APIs."""),
    MessagesPlaceholder("chat_history", optional=True),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),
])

agent = create_tool_calling_agent(llm, [tool], prompt)
executor = AgentExecutor(agent=agent, tools=[tool], verbose=True, max_iterations=5)

# The agent autonomously discovers and uses paid API services
response = executor.invoke({
    "input": "I need to analyze sentiment in 50 customer reviews. Find the cheapest API for this."
})
print(response["output"])
```
## Tool Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `query` | `str` | required | Natural language description of the capability needed |
| `max_price_usd` | `float` | `0.50` | Maximum acceptable price per call in USD |
| `network` | `str` | `"base"` | Blockchain network: `base`, `ethereum`, or `solana` |
## Tool Output
The tool returns a formatted string containing:
- Service name and endpoint URL
- Price per call in USD
- Uptime % and average latency (ms) — quality signals for routing
- Description of the service
- Python code snippet showing exactly how to call the endpoint
Example output:

```text
Found: SerpAPI Proxy
URL: https://api.example.com/search
Price: $0.01/call
Uptime: 99.7%
Latency: 210ms
Description: Google Search results via x402 micropayment

Python snippet:
import requests
resp = requests.post("https://api.example.com/search",
    headers={"X-Payment": "<x402-token>"},
    json={"q": "your query"})
print(resp.json())
```
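Because the tool returns one formatted string rather than structured data, downstream code sometimes needs to pull individual fields back out. Here is a minimal sketch of doing that with regular expressions; the field labels and patterns are assumptions based on the example output above, not a documented parsing API of the package:

```python
import re

def parse_discovery_output(text: str) -> dict:
    """Extract URL, price, uptime, and latency from the tool's formatted output."""
    fields = {}
    if m := re.search(r"URL: (\S+)", text):
        fields["url"] = m.group(1)
    if m := re.search(r"Price: \$([\d.]+)/call", text):
        fields["price_usd"] = float(m.group(1))
    if m := re.search(r"Uptime: ([\d.]+)%", text):
        fields["uptime_pct"] = float(m.group(1))
    if m := re.search(r"Latency: (\d+)ms", text):
        fields["latency_ms"] = int(m.group(1))
    return fields

sample = """Found: SerpAPI Proxy
URL: https://api.example.com/search
Price: $0.01/call
Uptime: 99.7%
Latency: 210ms"""

print(parse_discovery_output(sample))
```

A parser like this lets you enforce your own budget checks in code instead of trusting the LLM to read the price correctly.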
## Adding to an Existing Agent

If you already have a LangChain agent with other tools, just append the discovery tool:

```python
from langchain_x402_discovery import get_x402_discovery_tool

# Your existing tools
existing_tools = [my_calculator_tool, my_file_tool]

# Add x402 discovery
tools = existing_tools + [get_x402_discovery_tool()]

# Rebuild your agent with the extended tool list
agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)
```
## Using with LCEL (LangChain Expression Language)

```python
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
from langchain_x402_discovery import get_x402_discovery_tool

llm = ChatOpenAI(model="gpt-4o")
tool = get_x402_discovery_tool()
llm_with_tools = llm.bind_tools([tool])

messages = [HumanMessage("Find an image generation API under $0.10/call")]
response = llm_with_tools.invoke(messages)

# Handle tool call if present
if response.tool_calls:
    tool_result = tool.invoke(response.tool_calls[0]["args"])
    print(tool_result)
```
## How It Works

1. Your agent receives a task requiring an external API (e.g., "search the web for X")
2. The agent calls `x402_discover` with a natural language query (e.g., "web search")
3. The tool fetches the x402 discovery catalog
4. Services are filtered by `max_price_usd` and ranked by uptime/latency
5. The best match is returned with its endpoint URL and a ready-to-use code snippet
6. The agent uses the snippet to call the service and complete the task
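The filter-and-rank step can be sketched in plain Python. This is an illustrative guess at the selection logic (the package's actual scoring is not documented here): keep services at or under the price cap, then prefer higher uptime and, as a tiebreaker, lower latency.

```python
def pick_best(services, max_price_usd=0.50):
    """Filter by price cap, then rank by uptime (descending) and latency (ascending)."""
    affordable = [s for s in services if s["price_usd"] <= max_price_usd]
    if not affordable:
        return None
    # min() over (-uptime, latency) picks the highest-uptime, lowest-latency service
    return min(affordable, key=lambda s: (-s["uptime_pct"], s["latency_ms"]))

catalog = [
    {"name": "search-a", "price_usd": 0.01, "uptime_pct": 99.7, "latency_ms": 210},
    {"name": "search-b", "price_usd": 0.02, "uptime_pct": 99.9, "latency_ms": 150},
    {"name": "search-c", "price_usd": 0.80, "uptime_pct": 99.9, "latency_ms": 90},
]

best = pick_best(catalog, max_price_usd=0.05)
print(best["name"])  # -> search-b: highest uptime among services under the cap
```

Note that `search-c` has the best latency but is excluded by the $0.05 cap, which is why the price filter runs before the quality ranking.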
## Discovery API

Browse all available services: https://x402-discovery-api.onrender.com

- `GET /catalog` — List all registered x402 services with quality metrics
- `GET /discover?q=<query>` — Search services by natural language capability
- `GET /health` — API health check
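If you want to query the `/discover` route outside the LangChain tool, you can build the search URL yourself. This sketch only constructs the URL (the response schema is not shown above, so no parsing is assumed):

```python
from urllib.parse import urlencode

BASE = "https://x402-discovery-api.onrender.com"

def discover_url(query: str) -> str:
    """Build the /discover search URL for a natural language capability query."""
    return f"{BASE}/discover?{urlencode({'q': query})}"

print(discover_url("web search"))
# -> https://x402-discovery-api.onrender.com/discover?q=web+search
```

Using `urlencode` keeps multi-word queries safe without hand-escaping spaces.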
## License

MIT
## File details

### langchain_x402_discovery-1.0.1.tar.gz (source distribution)

- Size: 5.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `86ec4efc62740845d0635bfcf0d09f9d5f5f802e8eb4b17185008a214be71a0f` |
| MD5 | `ebaeded6d9584de13cc3827a58cfb2ea` |
| BLAKE2b-256 | `bf8f20d5900de4ca6374dc0363efaf07c84cdba46a30e8da030f1c6b10b5cd87` |

### langchain_x402_discovery-1.0.1-py3-none-any.whl (built distribution)

- Size: 6.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `861e0851325c00ff02af4f2d93e22ad92502955b1d5bce69a0a9b5f2dffba0fc` |
| MD5 | `bbfddff188cab528f352142ab6feedc8` |
| BLAKE2b-256 | `7c17482dca7e6f913ec9294056878e91b97e909ba5dba4935787b4b8b218df92` |