# HyperStore MCP

Plug 6,500+ AI apps into any LLM via the Model Context Protocol.
HyperStore is a curated directory of 6,500+ AI applications, developed by HyperGPT. This MCP server exposes the HyperStore catalog to any LLM client — Claude, ChatGPT, Cursor, Windsurf, Cline, Zed, Gemini, and anything else that speaks MCP.
Ask your LLM:
- "Find me a free AI tool that summarises PDFs."
- "Compare ChatGPT, Claude, and Gemini side-by-side."
- "Show me the top 5 image-generation apps with an API."
The LLM calls HyperStore MCP behind the scenes and answers with up-to-date, curated results.
## What you get
8 tools:
| Tool | Purpose |
|---|---|
| `search_apps` | Full-text keyword search |
| `ai_search` | Embedding-based semantic search |
| `get_app` | Full app detail (features, screenshots, pricing) |
| `list_apps` | Paginated apps with filters (category, pricing) |
| `list_categories` | Browse all 30+ categories |
| `category_apps` | Apps within a category |
| `browse_apps` | A-Z directory listing |
| `get_homepage` | Trending + top categories overview |
3 resources:
- `hyperstore://app/{slug}` — markdown rendering of any app
- `hyperstore://category/{slug}` — top apps in a category
- `hyperstore://catalog` — full category index
3 prompts:
- `find_tool_for_task` — guided discovery for a task
- `compare_apps` — side-by-side app comparison
- `discover_category` — explore a topic
## Install

### Option A — uvx (zero install, recommended)

Requires uv. One command and you're done:

```bash
uvx hyperstore-mcp
```
### Option B — pipx

```bash
pipx install hyperstore-mcp
hyperstore-mcp
```
### Option C — Docker (for remote hosting)

```bash
docker run --rm -p 8080:8080 ghcr.io/deficlow/hyperstore-mcp
# MCP Streamable HTTP is now served at http://localhost:8080/mcp
```
### Option D — Hosted endpoint (no install)

Use our managed Streamable HTTP server:

```
https://mcp.store.hypergpt.ai/mcp
```
## Connect from your LLM client

### Claude Desktop

Edit `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):
```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```

Restart Claude → tools appear in the 🛠 menu.
### Claude Code

```bash
claude mcp add hyperstore -- uvx hyperstore-mcp
```
### Cursor

`.cursor/mcp.json` (project) or `~/.cursor/mcp.json` (global):

```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```
### Windsurf

`~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```
### Cline (VS Code)

`settings.json`:

```json
{
  "cline.mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```
### Zed

`~/.config/zed/settings.json`:

```json
{
  "context_servers": {
    "hyperstore": {
      "command": {
        "path": "uvx",
        "args": ["hyperstore-mcp"]
      }
    }
  }
}
```
### Gemini CLI

`~/.gemini/settings.json`:

```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```
### ChatGPT (Pro / Team / Enterprise)

Settings → Connectors → Add custom connector:

- Name: HyperStore
- MCP Server URL: `https://mcp.store.hypergpt.ai/mcp`
- Authentication: None
### OpenAI Responses API

```python
from openai import OpenAI

client = OpenAI()
response = client.responses.create(
    model="gpt-4.1",
    tools=[{
        "type": "mcp",
        "server_label": "hyperstore",
        "server_url": "https://mcp.store.hypergpt.ai/mcp",
        "require_approval": "never",
    }],
    input="Find me 3 free AI tools for writing unit tests.",
)
print(response.output_text)
```
### Anthropic Messages API

The MCP connector is a beta feature, so the call goes through `client.beta.messages.create` with the `mcp-client-2025-04-04` beta flag:

```python
from anthropic import Anthropic

client = Anthropic()
response = client.beta.messages.create(
    model="claude-opus-4-1",
    max_tokens=1024,
    mcp_servers=[{
        "type": "url",
        "url": "https://mcp.store.hypergpt.ai/mcp",
        "name": "hyperstore",
    }],
    betas=["mcp-client-2025-04-04"],
    messages=[{"role": "user", "content": "Top 5 AI image generators?"}],
)
```
See `examples/` for ready-to-paste configs for every supported client.
## Run as a remote server

```bash
# Streamable HTTP (modern; ChatGPT / OpenAI / Anthropic)
hyperstore-mcp --transport http --host 0.0.0.0 --port 8080

# Legacy SSE (older MCP clients)
hyperstore-mcp --transport sse --port 8080
```
The hosted endpoint at https://mcp.store.hypergpt.ai runs the Docker image
behind a CDN — no auth, rate-limited per IP.
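For the curious: a Streamable HTTP client talks JSON-RPC over POST to the `/mcp` path, beginning with an `initialize` request. Here is a minimal sketch of that envelope per the MCP specification; the `clientInfo` values and the exact `protocolVersion` string are illustrative assumptions, not values this server requires:

```python
import json

def build_initialize_request(request_id: int = 1) -> dict:
    # First JSON-RPC message a client POSTs to the /mcp endpoint.
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.0.1"},
        },
    }

body = json.dumps(build_initialize_request())
```

Send `body` with any HTTP client; per the spec, the request should carry `Accept: application/json, text/event-stream`.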
## Configuration

All settings come from environment variables (see `.env.example`):
| Variable | Default | Purpose |
|---|---|---|
| `HYPERSTORE_API_BASE` | `https://store.hypergpt.ai` | Upstream API base URL |
| `HYPERSTORE_TIMEOUT` | `20` | HTTP timeout in seconds |
| `HYPERSTORE_USER_AGENT` | `hyperstore-mcp/{version}` | User-Agent string |
| `MCP_HOST` | `0.0.0.0` | Bind host (http/sse only) |
| `MCP_PORT` | `8080` | Bind port (http/sse only) |
| `LOG_LEVEL` | `INFO` | Logging level |
## Development

```bash
git clone https://github.com/deficlow/HyperStore-MCP
cd HyperStore-MCP
uv sync --all-extras
uv run pytest
uv run hyperstore-mcp   # stdio mode for local testing
```
Inspect the running server with the official MCP Inspector:
```bash
npx @modelcontextprotocol/inspector uvx hyperstore-mcp
```
## How it works
HyperStore MCP is a thin async wrapper around the HyperStore public REST API. It is read-only — no credentials, no writes, no PII. The same data that powers the website powers the MCP server. Updates land in your LLM the moment they land on the site.
```
LLM client ──MCP──▶ hyperstore-mcp ──HTTPS──▶ store.hypergpt.ai/api
```
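The read-only wrapper pattern in the diagram can be sketched with the standard library alone; the route name below is an assumed example for illustration, not a documented endpoint of the real API:

```python
import asyncio
import json
import urllib.request

API_BASE = "https://store.hypergpt.ai"

def api_url(path: str, base: str = API_BASE) -> str:
    # Every tool resolves to a read-only GET under /api on the upstream base URL.
    return f"{base.rstrip('/')}/api/{path.lstrip('/')}"

async def fetch_json(path: str) -> dict:
    # Offload the blocking stdlib HTTP call so the event loop stays free.
    def _get() -> dict:
        with urllib.request.urlopen(api_url(path), timeout=20) as resp:
            return json.loads(resp.read().decode("utf-8"))
    return await asyncio.to_thread(_get)
```

For instance, `api_url("apps")` yields `https://store.hypergpt.ai/api/apps`; because the wrapper only issues GETs, no credentials or write paths are involved.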
## License
MIT © HyperGPT
## File details

### Source distribution: hyperstore_mcp-0.1.0.tar.gz

- Size: 11.1 kB
- Tags: Source
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 on CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f29ad8feb0213a6eca1fcdfe7f0c9c4ad26562ff47f517e5ecc7d37560afe832` |
| MD5 | `5821f0f2cd371f7308c8470af0fcb6af` |
| BLAKE2b-256 | `8e6f139af63d1b1fd9c4d5d2a9006ab886806fa3b708832e2cbc35cafc225e59` |

Provenance (attestation bundle for hyperstore_mcp-0.1.0.tar.gz):

- Publisher: publish-pypi.yml on deficlow/HyperStore-MCP
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: hyperstore_mcp-0.1.0.tar.gz
- Subject digest: f29ad8feb0213a6eca1fcdfe7f0c9c4ad26562ff47f517e5ecc7d37560afe832
- Sigstore transparency entry: 1553681880
- Permalink: deficlow/HyperStore-MCP@d0cf81af27114fab2fed5b302e15567bbee64ef0
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/deficlow
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@d0cf81af27114fab2fed5b302e15567bbee64ef0
- Trigger Event: push
### Built distribution: hyperstore_mcp-0.1.0-py3-none-any.whl

- Size: 13.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 on CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `895d25cb75e66683550b13571b583b80929a61b4a453de89da71a93cc4c31fe4` |
| MD5 | `a7fd2dd079553c1af857b468fd963840` |
| BLAKE2b-256 | `641d464eb5cad9f178efe2089c9825074ccdd553d9c926ac98f0b795dd971532` |

Provenance (attestation bundle for hyperstore_mcp-0.1.0-py3-none-any.whl):

- Publisher: publish-pypi.yml on deficlow/HyperStore-MCP
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: hyperstore_mcp-0.1.0-py3-none-any.whl
- Subject digest: 895d25cb75e66683550b13571b583b80929a61b4a453de89da71a93cc4c31fe4
- Sigstore transparency entry: 1553681956
- Permalink: deficlow/HyperStore-MCP@d0cf81af27114fab2fed5b302e15567bbee64ef0
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/deficlow
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@d0cf81af27114fab2fed5b302e15567bbee64ef0
- Trigger Event: push