
meganova-mcp-server

MCP server for MegaNova AI — expose LLM inference, image/video generation, embeddings, cloud agents, and Nova Mesh routing to any MCP-compatible client (Claude Desktop, Cursor, VS Code, Windsurf, etc.).

What It Does

  • Tools — Inference: chat_completion, create_embeddings, rerank_documents
  • Tools — Media: generate_image, generate_video, transcribe_audio
  • Tools — Models: list_models (with filtering by modality, family, use case)
  • Tools — Cloud Agents: cloud_agent_info, cloud_agent_chat, cloud_agent_confirm_tool
  • Tools — Nova Mesh: route_task, chat_with_agent, list_agents, get_agent_info, execute_dag, get_call_log
  • Tools — Skills: list_skills, execute_skill, search_catalog
  • Resources: nova://agents (index), nova://agents/{name} (profile)
  • Prompts: route_task_prompt, agent_chat_prompt, skill_discovery_prompt

Quick Start

Install

pip install meganova-mcp-server

Or from source:

git clone https://github.com/MeganovaAI/meganova-mcp-server.git
cd meganova-mcp-server
pip install -e .

Configure

Set environment variables:

export MEGANOVA_API_KEY=your-api-key                          # Required
export MEGANOVA_API_URL=https://api.meganova.ai/v1            # Default
export MEGANOVA_STUDIO_URL=https://studio-api.meganova.ai     # Default
export NOVA_MESH_URL=https://your-mesh-endpoint               # Optional, for mesh tools
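Before launching, it can help to sanity-check the environment. A minimal sketch (the variable names match the exports above; the default URL fallback mirrors the documented default):

```python
import os

# Pre-flight check: MEGANOVA_API_KEY is the only required variable;
# the URLs fall back to the defaults shown above when unset.
api_key = os.environ.get("MEGANOVA_API_KEY")
api_url = os.environ.get("MEGANOVA_API_URL", "https://api.meganova.ai/v1")

print("API key set:", bool(api_key))
print("API endpoint:", api_url)
```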

Run (stdio)

meganova-mcp-server

Run (HTTP)

MCP_TRANSPORT=http MCP_PORT=8080 meganova-mcp-server

Client Setup

Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "meganova": {
      "command": "meganova-mcp-server",
      "env": {
        "MEGANOVA_API_KEY": "your-api-key"
      }
    }
  }
}

Claude Code

Add to .mcp.json in your project:

{
  "mcpServers": {
    "meganova": {
      "command": "meganova-mcp-server",
      "env": {
        "MEGANOVA_API_KEY": "your-api-key"
      }
    }
  }
}

Cursor / Windsurf

Run the server with the HTTP transport and point the client at http://localhost:8080/mcp.
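For Cursor, an HTTP server is typically registered in a config file (commonly ~/.cursor/mcp.json; the exact file location and key names depend on your Cursor version, so treat this as a sketch):

```json
{
  "mcpServers": {
    "meganova": {
      "url": "http://localhost:8080/mcp"
    }
  }
}
```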

Tool Reference

Inference

  • chat_completion — Send messages to any MegaNova LLM (Manta, Qwen, Gemini, DeepSeek, etc.)
  • create_embeddings — Generate text embeddings for semantic search and RAG
  • rerank_documents — Rerank documents by relevance to a query
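Under the hood, an MCP client invokes these as standard JSON-RPC 2.0 tools/call requests. A minimal sketch for chat_completion (the argument names and model id below are illustrative, not taken from the server's schema — check the schema returned by tools/list for the real parameters):

```python
import json

# Sketch of the JSON-RPC 2.0 "tools/call" message an MCP client would send
# to invoke the chat_completion tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "chat_completion",
        "arguments": {
            "model": "qwen-2.5-72b-instruct",  # hypothetical model id
            "messages": [{"role": "user", "content": "Hello!"}],
        },
    },
}
print(json.dumps(request, indent=2))
```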

Media

  • generate_image — Text-to-image with FLUX, SeedDream, and other models
  • generate_video — Text-to-video generation
  • transcribe_audio — Audio-to-text transcription from a URL

Model Discovery

  • list_models — Browse available models with filters (modality, family, pricing)

Cloud Agents

  • cloud_agent_info — Get info about a deployed Studio agent
  • cloud_agent_chat — Chat with a deployed agent (supports multi-turn conversations)
  • cloud_agent_confirm_tool — Approve or reject pending agent tool calls

Nova Mesh

  • route_task — Auto-route tasks to the best agent in the mesh
  • chat_with_agent — Direct message to a specific mesh agent
  • list_agents / get_agent_info — Discover mesh agents
  • execute_dag — Run multi-step DAG execution plans
  • get_call_log — View recent LLM call history
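A mesh routing call is just another MCP tools/call. For example (the "task" argument name is an assumption; consult the route_task input schema exposed by the server for the actual parameters):

```python
import json

# Illustrative JSON-RPC 2.0 message for routing a task through Nova Mesh.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "route_task",
        "arguments": {"task": "Summarize the latest build failures"},  # hypothetical argument
    },
}
print(json.dumps(request, indent=2))
```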

Skills

  • list_skills / search_catalog — Discover available skill packs
  • execute_skill — Run a specific tool from a skill pack

Docker

docker build -t meganova-mcp-server .
docker run -e MEGANOVA_API_KEY=key -p 8080:8080 meganova-mcp-server
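The same container can be described with Compose. A minimal sketch (the service name is arbitrary; MCP_TRANSPORT=http is included on the assumption the container should serve HTTP, mirroring the Run (HTTP) section):

```yaml
services:
  meganova-mcp:
    build: .
    environment:
      MEGANOVA_API_KEY: ${MEGANOVA_API_KEY}
      MCP_TRANSPORT: http
      MCP_PORT: "8080"
    ports:
      - "8080:8080"
```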

Development

pip install -e ".[dev]"
pytest

License

MIT
