langchain-tensorfeed
LangChain integration for TensorFeed.ai, the AI news and real-time data hub built for humans and AI agents.
This package gives LangChain agents zero-friction access to:
- The latest AI news from 12+ industry sources
- Live up/down status for Claude, ChatGPT, Gemini, Copilot, Perplexity, Mistral, HuggingFace, and more
- Current per-token pricing for AI models across every major provider
- Public benchmark scores (MMLU, HumanEval, GPQA, MATH, SWE-Bench, etc.)
- A document loader (`TensorFeedLoader`) so you can index TensorFeed news into a vector store for RAG
All endpoints used by this package are free and require no API key.
Installation
```shell
pip install langchain-tensorfeed
```
Quick start: tools
```python
from langchain_tensorfeed import (
    TensorFeedNewsTool,
    TensorFeedStatusTool,
    TensorFeedPricingTool,
    TensorFeedBenchmarksTool,
)

news = TensorFeedNewsTool().invoke({"category": "anthropic", "limit": 5})
print(news)

status = TensorFeedStatusTool().invoke({"service": "claude"})
print(status)

pricing = TensorFeedPricingTool().invoke({"provider": "openai"})
print(pricing)

scores = TensorFeedBenchmarksTool().invoke({"benchmark": "MMLU"})
print(scores)
```
Each tool returns a compact JSON string sized for token efficiency. Input schemas are validated with Pydantic, so an LLM can call the tools directly through the standard tool-calling interface.
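Since the tools return JSON strings, post-processing in your own code is a `json.loads` away. A minimal sketch (the field names below are illustrative assumptions, not the documented response schema):

```python
import json

# Hypothetical tool output; the real TensorFeed field names may differ.
raw = '{"service": "claude", "status": "operational", "checked_at": "2026-04-01T12:00:00Z"}'

payload = json.loads(raw)
is_up = payload.get("status") == "operational"
if not is_up:
    print(f"{payload['service']} is degraded; consider a fallback model")
```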
Quick start: document loader
```python
from langchain_tensorfeed import TensorFeedLoader

loader = TensorFeedLoader(
    category="research",
    limit=100,
    start_date="2026-04-01T00:00:00Z",
)
docs = loader.load()

for d in docs[:3]:
    print(d.metadata["title"], "->", d.metadata["url"])
```
`TensorFeedLoader` returns standard `langchain_core.documents.Document` objects with `page_content` set to title + snippet and metadata carrying `id`, `url`, `source`, `categories`, `published_at`, and `fetched_at`. You can plug it into any LangChain text splitter or vector store.
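Long articles are usually chunked before embedding. The sketch below shows the core idea on a plain string, with a fixed window and overlap; in a real pipeline you would hand `docs` to a LangChain text splitter such as `RecursiveCharacterTextSplitter` instead of rolling your own:

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks with overlap -- the basic
    mechanism behind most text splitters."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

page_content = "TensorFeed headline. " * 40  # stands in for doc.page_content
chunks = chunk_text(page_content)
```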
Using the tools with an agent
```python
from langchain_anthropic import ChatAnthropic
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate
from langchain_tensorfeed import (
    TensorFeedBenchmarksTool,
    TensorFeedNewsTool,
    TensorFeedPricingTool,
    TensorFeedStatusTool,
)

llm = ChatAnthropic(model="claude-opus-4-7")
tools = [
    TensorFeedNewsTool(),
    TensorFeedStatusTool(),
    TensorFeedPricingTool(),
    TensorFeedBenchmarksTool(),
]

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are an AI industry analyst. Use the TensorFeed tools to ground every answer in current data."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

result = executor.invoke({
    "input": "Is Claude up right now, and how does its pricing compare to GPT-4o?"
})
print(result["output"])
```
Tool reference
| Tool | Description | Input schema |
|---|---|---|
| `TensorFeedNewsTool` | Latest AI news headlines | `category` (optional), `limit` (1-50, default 10) |
| `TensorFeedStatusTool` | Live AI service status | `service` (optional name) |
| `TensorFeedPricingTool` | Per-token pricing | `provider`, `model` (both optional substring filters) |
| `TensorFeedBenchmarksTool` | Public benchmark scores | `benchmark`, `model` (both optional substring filters) |
TensorFeedLoader options
| Argument | Type | Description |
|---|---|---|
| `category` | `str` | API-side category filter |
| `categories` | `Sequence[str]` | Client-side multi-category filter |
| `limit` | `int` | Max articles to fetch (default 50) |
| `start_date` | `str \| datetime` | Only return articles published on or after this date |
| `end_date` | `str \| datetime` | Only return articles published on or before this date |
| `base_url` | `str` | Override the API host |
| `timeout` | `float` | HTTP timeout in seconds |
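Because `start_date` and `end_date` accept either an ISO 8601 string or a `datetime`, a wrapper around the loader typically normalizes both forms before building the request. A sketch of that normalization (the helper name is illustrative, not part of the package API):

```python
from datetime import datetime, timezone

def to_iso8601(value: "str | datetime") -> str:
    """Normalize a str-or-datetime bound into an ISO 8601 string."""
    if isinstance(value, datetime):
        # Treat naive datetimes as UTC so the filter is unambiguous.
        if value.tzinfo is None:
            value = value.replace(tzinfo=timezone.utc)
        return value.isoformat().replace("+00:00", "Z")
    # Validate that the string parses; map a trailing "Z" to an explicit offset.
    datetime.fromisoformat(value.replace("Z", "+00:00"))
    return value

start = to_iso8601(datetime(2026, 4, 1))  # -> "2026-04-01T00:00:00Z"
```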
Premium endpoints
The TensorFeed API also exposes paid endpoints (model routing recommendations, news search, model comparisons, forecasts, webhook watches) that are billed in USDC on Base. Those are out of scope for this package; if you need them in LangChain, the standalone tensorfeed Python SDK covers the full surface and you can wrap any endpoint as a custom BaseTool.
Contributing
Source: github.com/RipperMercs/tensorfeed under `sdk/langchain-python/`. To set up a development environment:

```shell
cd sdk/langchain-python
pip install -e ".[dev]"
pytest
```
Links
- TensorFeed developer docs: https://tensorfeed.ai/developers
- API reference: https://tensorfeed.ai/api/meta
- Issue tracker: https://github.com/RipperMercs/tensorfeed/issues
- License: MIT
File details
Details for the file langchain_tensorfeed-0.1.0.tar.gz.
File metadata
- Download URL: langchain_tensorfeed-0.1.0.tar.gz
- Upload date:
- Size: 14.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c00e75d3bd62d503dd0d0d1fa1f7ca8cd1b94db13d7a64d698c5ee207a1cc7ed` |
| MD5 | `da593d84b2743de30d2d57aca91bd56d` |
| BLAKE2b-256 | `8bac04691a2d6d76eea85a54b03ab89dccb165e22532dc4e561183a1d05d9eef` |
File details
Details for the file langchain_tensorfeed-0.1.0-py3-none-any.whl.
File metadata
- Download URL: langchain_tensorfeed-0.1.0-py3-none-any.whl
- Upload date:
- Size: 11.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `dfa422b725f0c37f52ac94c1ea09e1a1fcb662695a7ae0f3e70bac686d5c1a54` |
| MD5 | `2cc58275100ff1d6dd618738a2853dca` |
| BLAKE2b-256 | `2a7f7a30c854b2fb6d13da47a29866a33c0c09f457ec4b52db8246ddd45a290c` |