
HyperStore MCP

Plug 6,500+ AI apps into any LLM via the Model Context Protocol.


HyperStore is a curated directory of 6,500+ AI applications, developed by HyperGPT. This MCP server exposes the HyperStore catalog to any LLM client — Claude, ChatGPT, Cursor, Windsurf, Cline, Zed, Gemini, and anything else that speaks MCP.

Ask your LLM:

"Find me a free AI tool that summarises PDFs." "Compare ChatGPT, Claude, and Gemini side-by-side." "Show me the top 5 image-generation apps with an API."

The LLM calls HyperStore MCP behind the scenes and answers with up-to-date, curated results.


What you get

8 tools:

Tool             Purpose
---------------  ------------------------------------------------
search_apps      Full-text keyword search
ai_search        Embedding-based semantic search
get_app          Full app detail (features, screenshots, pricing)
list_apps        Paginated apps with filters (category, pricing)
list_categories  Browse all 30+ categories
category_apps    Apps within a category
browse_apps      A-Z directory listing
get_homepage     Trending + top categories overview

3 resources:

  • hyperstore://app/{slug} — markdown rendering of any app
  • hyperstore://category/{slug} — top apps in a category
  • hyperstore://catalog — full category index

3 prompts:

  • find_tool_for_task — guided discovery for a task
  • compare_apps — side-by-side app comparison
  • discover_category — explore a topic
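
Outside a chat client, you can exercise the tools and resources above with the official MCP Python SDK (the mcp package). A minimal sketch, assuming the search tool takes a "query" argument — list_tools() returns the authoritative schemas:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over stdio, exactly as an LLM client would.
    params = StdioServerParameters(command="uvx", args=["hyperstore-mcp"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Call a tool; "query" is an assumed argument name.
            result = await session.call_tool("search_apps", {"query": "pdf summarizer"})
            print(result.content)

            # Read a resource by URI.
            catalog = await session.read_resource("hyperstore://catalog")
            print(catalog.contents[0])


asyncio.run(main())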

Install

Option A — uvx (zero install, recommended)

Requires uv. One command and you're done:

uvx hyperstore-mcp

Option B — pipx

pipx install hyperstore-mcp
hyperstore-mcp

Option C — Docker (for remote hosting)

docker run --rm -p 8080:8080 ghcr.io/deficlow/hyperstore-mcp
# Now MCP Streamable HTTP at http://localhost:8080/mcp

Option D — Hosted endpoint (no install)

Use our managed Streamable HTTP server:

https://mcp.store.hypergpt.ai/mcp
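
A quick way to sanity-check the hosted endpoint is the SDK's Streamable HTTP client. A minimal sketch (the client yields a third value, a session-ID getter, ignored here):

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Connect to the hosted endpoint and list the advertised tools.
    async with streamablehttp_client("https://mcp.store.hypergpt.ai/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())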

Connect from your LLM client

Claude Desktop

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}

Restart Claude → tools appear in the 🛠 menu.

Claude Code

claude mcp add hyperstore -- uvx hyperstore-mcp

Cursor

.cursor/mcp.json (project) or ~/.cursor/mcp.json (global):

{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}

Windsurf

~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}

Cline (VS Code)

Open Cline's MCP settings file (cline_mcp_settings.json, reachable from the MCP Servers panel in the extension):

{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}

Zed

~/.config/zed/settings.json:

{
  "context_servers": {
    "hyperstore": {
      "command": {
        "path": "uvx",
        "args": ["hyperstore-mcp"]
      }
    }
  }
}

Gemini CLI

~/.gemini/settings.json:

{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}

ChatGPT (Pro / Team / Enterprise)

Settings → Connectors → Add custom connector:

  • Name: HyperStore
  • MCP Server URL: https://mcp.store.hypergpt.ai/mcp
  • Authentication: None

OpenAI Responses API

from openai import OpenAI

client = OpenAI()
response = client.responses.create(
    model="gpt-4.1",
    tools=[{
        "type": "mcp",
        "server_label": "hyperstore",
        "server_url": "https://mcp.store.hypergpt.ai/mcp",
        "require_approval": "never",
    }],
    input="Find me 3 free AI tools for writing unit tests.",
)
print(response.output_text)

Anthropic Messages API

from anthropic import Anthropic

client = Anthropic()
# The MCP connector is a beta feature and requires the beta flag below.
response = client.beta.messages.create(
    model="claude-opus-4-1",
    max_tokens=1024,
    betas=["mcp-client-2025-04-04"],
    mcp_servers=[{
        "type": "url",
        "url": "https://mcp.store.hypergpt.ai/mcp",
        "name": "hyperstore",
    }],
    messages=[{"role": "user", "content": "Top 5 AI image generators?"}],
)
print("".join(block.text for block in response.content if block.type == "text"))

See examples/ for ready-to-paste configs for every supported client.


Run as a remote server

# Streamable HTTP (modern, ChatGPT/OpenAI/Anthropic)
hyperstore-mcp --transport http --host 0.0.0.0 --port 8080

# Legacy SSE (older MCP clients)
hyperstore-mcp --transport sse --port 8080

The hosted endpoint at https://mcp.store.hypergpt.ai runs the Docker image behind a CDN — no auth, rate-limited per IP.
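
The SDK also ships a client for the legacy SSE transport. A sketch against a locally started instance, assuming the server exposes its SSE endpoint at /sse (a common default; adjust to your deployment):

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # /sse is an assumption; check where your deployment mounts the transport.
    async with sse_client("http://localhost:8080/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            print([tool.name for tool in (await session.list_tools()).tools])


asyncio.run(main())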


Configuration

All settings come from environment variables (see .env.example):

Variable               Default                    Purpose
---------------------  -------------------------  --------------------------
HYPERSTORE_API_BASE    https://store.hypergpt.ai  Upstream API base URL
HYPERSTORE_TIMEOUT     20                         HTTP timeout in seconds
HYPERSTORE_USER_AGENT  hyperstore-mcp/{version}   User-Agent string
MCP_HOST               0.0.0.0                    Bind host (http/sse only)
MCP_PORT               8080                       Bind port (http/sse only)
LOG_LEVEL              INFO                       Logging level
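
The same variables can be set per-launch when a client spawns the server; most of the JSON client configs above accept an equivalent "env" object. A sketch using the Python SDK's stdio parameters:

from mcp import StdioServerParameters

# Spawn the server with a shorter upstream timeout and verbose logging.
params = StdioServerParameters(
    command="uvx",
    args=["hyperstore-mcp"],
    env={
        "HYPERSTORE_TIMEOUT": "5",
        "LOG_LEVEL": "DEBUG",
    },
)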

Development

git clone https://github.com/deficlow/HyperStore-MCP
cd HyperStore-MCP
uv sync --all-extras
uv run pytest
uv run hyperstore-mcp        # stdio mode for local testing

Inspect the running server with the official MCP Inspector:

npx @modelcontextprotocol/inspector uvx hyperstore-mcp

How it works

HyperStore MCP is a thin async wrapper around the HyperStore public REST API. It is read-only — no credentials, no writes, no PII. The same data that powers the website powers the MCP server. Updates land in your LLM the moment they land on the site.

LLM client ──MCP──▶ hyperstore-mcp ──HTTPS──▶ store.hypergpt.ai/api
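
Each tool handler reduces to a single upstream GET. A simplified sketch of the pattern — the /api/apps/search route and the "q" parameter are illustrative placeholders, not the actual upstream routes:

import os

import httpx

API_BASE = os.environ.get("HYPERSTORE_API_BASE", "https://store.hypergpt.ai")
TIMEOUT = float(os.environ.get("HYPERSTORE_TIMEOUT", "20"))


async def search_apps(query: str) -> dict:
    # Read-only proxy: forward the query upstream, return the JSON as-is.
    async with httpx.AsyncClient(timeout=TIMEOUT) as client:
        resp = await client.get(f"{API_BASE}/api/apps/search", params={"q": query})
        resp.raise_for_status()
        return resp.json()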

License

MIT © HyperGPT
