
meganova-mcp-server

MCP server for MegaNova AI — expose LLM inference, image/video generation, embeddings, cloud agents, and Nova Mesh routing to any MCP-compatible client (Claude Desktop, Cursor, VS Code, Windsurf, etc.).

What It Does

The server implements all three MCP primitives:

  • Tools — Inference: chat_completion, create_embeddings, rerank_documents
  • Tools — Media: generate_image, generate_video, transcribe_audio
  • Tools — Models: list_models (with filtering by modality, family, use case)
  • Tools — Cloud Agents: cloud_agent_info, cloud_agent_chat, cloud_agent_confirm_tool
  • Tools — Nova Mesh: route_task, chat_with_agent, list_agents, get_agent_info, execute_dag, get_call_log
  • Tools — Skills: list_skills, execute_skill, search_catalog
  • Resources: nova://agents (agent index), nova://agents/{name} (agent profile)
  • Prompts: route_task_prompt, agent_chat_prompt, skill_discovery_prompt
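All of the tools above are invoked through MCP's standard tools/call JSON-RPC method. As a rough sketch of the wire format (the argument names under "arguments" are illustrative, not taken from this server's schema — use tools/list to see the real parameters):

```python
import json

# Minimal MCP tools/call request (JSON-RPC 2.0).
# "model" and "messages" below are hypothetical argument names.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "chat_completion",
        "arguments": {
            "model": "example-model-id",
            "messages": [{"role": "user", "content": "Hello"}],
        },
    },
}

payload = json.dumps(request)
print(payload)
```

MCP clients like Claude Desktop build and send these requests for you; the shape only matters if you are writing your own client.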

Quick Start

Install

pip install meganova-mcp-server

Or from source:

git clone https://github.com/MeganovaAI/meganova-mcp-server.git
cd meganova-mcp-server
pip install -e .

Configure

Set environment variables:

export MEGANOVA_API_KEY=your-api-key                          # Required
export MEGANOVA_API_URL=https://api.meganova.ai/v1            # Default
export MEGANOVA_STUDIO_URL=https://studio-api.meganova.ai     # Default
export NOVA_MESH_URL=https://your-mesh-endpoint               # Optional, for mesh tools
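A minimal sketch of how a script might resolve these variables with the same defaults (the variable names and default URLs come from the list above; the load_config helper itself is illustrative, not part of this package):

```python
def load_config(env: dict) -> dict:
    """Resolve MegaNova settings from an environment mapping,
    falling back to the defaults documented above."""
    if "MEGANOVA_API_KEY" not in env:
        raise RuntimeError("MEGANOVA_API_KEY is required")
    return {
        "api_key": env["MEGANOVA_API_KEY"],
        "api_url": env.get("MEGANOVA_API_URL", "https://api.meganova.ai/v1"),
        "studio_url": env.get("MEGANOVA_STUDIO_URL", "https://studio-api.meganova.ai"),
        "mesh_url": env.get("NOVA_MESH_URL"),  # optional: mesh tools need this
    }

cfg = load_config({"MEGANOVA_API_KEY": "your-api-key"})
```

If NOVA_MESH_URL is unset, only the non-mesh tools are usable.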

Run (stdio)

meganova-mcp-server

Run (HTTP)

MCP_TRANSPORT=http MCP_PORT=8080 meganova-mcp-server

Client Setup

Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "meganova": {
      "command": "meganova-mcp-server",
      "env": {
        "MEGANOVA_API_KEY": "your-api-key"
      }
    }
  }
}

Claude Code

Add to .mcp.json in your project:

{
  "mcpServers": {
    "meganova": {
      "command": "meganova-mcp-server",
      "env": {
        "MEGANOVA_API_KEY": "your-api-key"
      }
    }
  }
}

Cursor / Windsurf

Run the server with the HTTP transport (see "Run (HTTP)" above) and point the client at http://localhost:8080/mcp.
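To smoke-test the HTTP endpoint without a full client, you can send the JSON-RPC initialize handshake yourself. A sketch with the standard library (the protocolVersion and clientInfo values are illustrative; depending on the transport version the server may also expect specific Accept headers):

```python
import json
import urllib.request

# JSON-RPC initialize request for the MCP HTTP endpoint.
init = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
    },
}

req = urllib.request.Request(
    "http://localhost:8080/mcp",
    data=json.dumps(init).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment with the server running
```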

Tool Reference

Inference

  • chat_completion — Send messages to any MegaNova LLM (Manta, Qwen, Gemini, Deepseek, etc.)
  • create_embeddings — Generate text embeddings for semantic search and RAG
  • rerank_documents — Rerank documents by relevance to a query
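Embedding vectors from create_embeddings are typically consumed downstream for similarity ranking. A minimal sketch of that step (the vectors here are toy values, not real model output):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

query = [0.1, 0.9, 0.2]
docs = {"doc-a": [0.1, 0.8, 0.3], "doc-b": [0.9, 0.1, 0.0]}

# Rank documents by similarity to the query vector.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
```

For relevance ordering at scale, rerank_documents does this server-side with a dedicated reranking model instead of raw cosine similarity.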

Media

  • generate_image — Text-to-image with FLUX, SeedDream, and other models
  • generate_video — Text-to-video generation
  • transcribe_audio — Audio-to-text transcription from URL

Model Discovery

  • list_models — Browse available models with filters (modality, family, pricing)
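The filtering idea, expressed client-side as a sketch (the catalog entries and field names below are invented for illustration — the real tool returns the live model list):

```python
# Hypothetical catalog entries.
models = [
    {"id": "manta-large", "family": "manta", "modality": "text"},
    {"id": "flux-pro", "family": "flux", "modality": "image"},
    {"id": "qwen-embed", "family": "qwen", "modality": "embedding"},
]

def filter_models(catalog, **criteria):
    """Keep entries matching every given field=value criterion."""
    return [m for m in catalog if all(m.get(k) == v for k, v in criteria.items())]

text_models = filter_models(models, modality="text")
```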

Cloud Agents

  • cloud_agent_info — Get info about a deployed Studio agent
  • cloud_agent_chat — Chat with a deployed agent (supports multi-turn conversations)
  • cloud_agent_confirm_tool — Approve or reject pending agent tool calls

Nova Mesh

  • route_task — Auto-route tasks to the best agent in the mesh
  • chat_with_agent — Direct message to a specific mesh agent
  • list_agents / get_agent_info — Discover mesh agents
  • execute_dag — Run multi-step DAG execution plans
  • get_call_log — View recent LLM call history
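To make the execute_dag idea concrete, here is a sketch of a multi-step plan as a dependency graph and how a valid step order falls out of it (the node names and dict shape are illustrative, not the tool's actual plan schema):

```python
from graphlib import TopologicalSorter

# Hypothetical plan: each key is a step, each value lists the
# steps it depends on. "research" feeds both "draft" and
# "summarize"; "review" depends on "draft".
plan = {
    "research": [],
    "draft": ["research"],
    "summarize": ["research"],
    "review": ["draft"],
}

# A valid execution order: every step appears after its dependencies.
order = list(TopologicalSorter(plan).static_order())
```

Independent steps (here, "draft" and "summarize") could also run in parallel, since neither depends on the other.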

Skills

  • list_skills / search_catalog — Discover available skill packs
  • execute_skill — Run a specific tool from a skill pack

Docker

docker build -t meganova-mcp-server .
docker run -e MEGANOVA_API_KEY=key -p 8080:8080 meganova-mcp-server

Development

pip install -e ".[dev]"
pytest

License

MIT
