
Mealie MCP Server for Agentic AI!


Mealie - A2A | AG-UI | MCP


Version: 0.3.0

Overview

Mealie MCP Server + A2A Server

This package includes a Model Context Protocol (MCP) server and an out-of-the-box Agent2Agent (A2A) agent.

Manage your self-hosted Mealie instance through an MCP server!

This repository is actively maintained - Contributions are welcome!

Supports:

  • User & Household Management
  • Recipe Management (CRUD, Import, Ratings)
  • Meal Planning (Organizer)
  • Shopping Lists
  • System Administration
  • Safe search levels (where applicable)
  • Pagination control

MCP

MCP Tools

Category Description Tag(s)
admin Manage administrative tasks admin
app Manage application settings and info app
explore Explore recipes and content explore
groups Manage recipe groups groups
households Manage households households
organizer Organize meals and plans organizer
recipe Manage individual recipes recipe
recipes Manage recipe collections recipes
shared Manage shared content shared
users Manage users users
utils Utility functions utils
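The tag column above suggests how tool access can be scoped by category (the DomainNode described later enables only one domain's tools at a time). As a purely illustrative sketch with invented tool names, filtering a tool registry by enabled tags might look like:

```python
# Hypothetical tool registry; the tags reuse the category tags from the table above.
TOOLS = {
    "recipe_get": {"recipe"},
    "recipe_create": {"recipe"},
    "recipes_search": {"recipes", "explore"},
    "shopping_list_add": {"households"},
    "user_delete": {"admin", "users"},
}

def allowed_tools(enabled_tags):
    """Return tool names whose tag set intersects the enabled tags."""
    enabled = set(enabled_tags)
    return sorted(name for name, tags in TOOLS.items() if tags & enabled)

print(allowed_tools(["recipe"]))  # ['recipe_create', 'recipe_get']
```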

Using as an MCP Server

The MCP Server can be run in two modes: stdio (for local testing) or http (for networked access). To start the server, use the following commands:

Run in stdio mode (default):

mealie-mcp --transport "stdio"

Run in HTTP mode:

mealie-mcp --transport "http" --host "0.0.0.0" --port "8000"

AI Prompt:

Find a recipe for lasagna

AI Response:

Found 3 recipes for "lasagna":
1. Classic Meat Lasagna
2. Vegetable Lasagna
3. Spinach Lasagna Rolls
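When running in HTTP mode, MCP clients speak JSON-RPC to the server. A search like the one above would travel as a `tools/call` request of roughly this shape; the tool name and argument keys here are illustrative, not the server's actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "recipes_search",
    "arguments": { "query": "lasagna" }
  }
}
```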

A2A Agent

This package also includes an A2A agent server that can be used to interact with the Mealie MCP server.

Architecture:

---
config:
  layout: dagre
---
flowchart TB
 subgraph subGraph0["Agent Capabilities"]
        C["Agent"]
        B["A2A Server - Uvicorn/FastAPI"]
        D["MCP Tools"]
        F["Agent Skills"]
  end
    C --> D & F
    A["User Query"] --> B
    B --> C
    D --> E["Platform API"]

     C:::agent
     B:::server
     A:::server
    classDef server fill:#f9f,stroke:#333
    classDef agent fill:#bbf,stroke:#333,stroke-width:2px
    style B stroke:#000000,fill:#FFD600
    style D stroke:#000000,fill:#BBDEFB
    style F fill:#BBDEFB
    style A fill:#C8E6C9
    style subGraph0 fill:#FFF9C4

Component Interaction Diagram

sequenceDiagram
    participant User
    participant Server as A2A Server
    participant Agent as Agent
    participant Skill as Agent Skills
    participant MCP as MCP Tools

    User->>Server: Send Query
    Server->>Agent: Invoke Agent
    Agent->>Skill: Analyze Available Skills
    Skill->>Agent: Provide Guidance on Next Steps
    Agent->>MCP: Invoke Tool
    MCP-->>Agent: Tool Response Returned
    Agent-->>Agent: Summarize Results
    Agent-->>Server: Final Response
    Server-->>User: Output
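The interaction above can be sketched as a plain control loop. Everything here is a stand-in (the stub skills, tools, classifier, and summarizer are invented), not the package's actual implementation:

```python
def handle_query(query, skills, tools, classify, summarize):
    """Mirror the sequence diagram: consult skills, invoke a tool, summarize."""
    skill = classify(query, skills)   # Agent -> Skills: pick guidance
    raw = tools[skill](query)         # Agent -> MCP Tools: invoke the matching tool
    return summarize(raw)             # Agent: summarize the tool response

# Stub wiring, for illustration only.
skills = ["search_recipes"]
tools = {"search_recipes": lambda q: [f"Result for {q!r}"]}
result = handle_query(
    "Find a recipe for lasagna",
    skills,
    tools,
    classify=lambda q, s: s[0],
    summarize=lambda raw: f"Found {len(raw)} result(s)",
)
print(result)  # Found 1 result(s)
```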

Graph Architecture

This agent uses pydantic-graph to orchestrate routing: queries are classified into a domain, and each domain gets a focused sub-agent with a small tool context.

---
title: Mealie MCP Graph Agent
---
stateDiagram-v2
  [*] --> RouterNode: User Query
  RouterNode --> DomainNode: Classified Domain
  RouterNode --> [*]: Low confidence / Error
  DomainNode --> [*]: Domain Result

  • RouterNode: A fast, lightweight LLM (e.g., nvidia/nemotron-3-super) that classifies the user's query into one of the specialized domains.
  • DomainNode: The executor node. For the selected domain, it dynamically sets environment variables to temporarily enable ONLY the tools relevant to that domain, creating a highly focused sub-agent (e.g., gpt-4o) to complete the request. This preserves LLM context and prevents tool hallucination.
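The routing behavior can be illustrated with a small stand-in: a keyword scorer in place of the real LLM classifier. The domain names, keywords, and confidence threshold below are invented for the sketch:

```python
# Invented keyword lists standing in for the RouterNode's LLM classifier.
DOMAINS = {
    "recipe": ["recipe", "ingredient", "cook"],
    "organizer": ["meal plan", "plan", "schedule"],
    "households": ["shopping", "household"],
}

def route(query, threshold=1):
    """RouterNode stand-in: score each domain by keyword hits.

    Returns the winning domain, or None on low confidence
    (the RouterNode --> [*] edge in the diagram above).
    """
    q = query.lower()
    scores = {d: sum(k in q for k in kws) for d, kws in DOMAINS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

def run_domain(domain, query):
    """DomainNode stand-in: a focused sub-agent would see only this
    domain's tools; here we just report which domain was selected."""
    return f"[{domain}] handling: {query}"

print(route("Find a recipe for lasagna"))  # recipe
```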

Usage

MCP CLI

Short Flag Long Flag Description
-h --help Display help information
-t --transport Transport method: 'stdio', 'http', or 'sse' [legacy] (default: stdio)
-s --host Host address for HTTP transport (default: 0.0.0.0)
-p --port Port number for HTTP transport (default: 8000)
--auth-type Authentication type: 'none', 'static', 'jwt', 'oauth-proxy', 'oidc-proxy', 'remote-oauth' (default: none)
--token-jwks-uri JWKS URI for JWT verification
--token-issuer Issuer for JWT verification
--token-audience Audience for JWT verification
--oauth-upstream-auth-endpoint Upstream authorization endpoint for OAuth Proxy
--oauth-upstream-token-endpoint Upstream token endpoint for OAuth Proxy
--oauth-upstream-client-id Upstream client ID for OAuth Proxy
--oauth-upstream-client-secret Upstream client secret for OAuth Proxy
--oauth-base-url Base URL for OAuth Proxy
--oidc-config-url OIDC configuration URL
--oidc-client-id OIDC client ID
--oidc-client-secret OIDC client secret
--oidc-base-url Base URL for OIDC Proxy
--remote-auth-servers Comma-separated list of authorization servers for Remote OAuth
--remote-base-url Base URL for Remote OAuth
--allowed-client-redirect-uris Comma-separated list of allowed client redirect URIs
--eunomia-type Eunomia authorization type: 'none', 'embedded', 'remote' (default: none)
--eunomia-policy-file Policy file for embedded Eunomia (default: mcp_policies.json)
--eunomia-remote-url URL for remote Eunomia server
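As a sketch only (not the package's actual implementation), a subset of the flags above could be wired up with argparse, which also shows how the short/long flag pairs and defaults from the table fit together:

```python
import argparse

def build_parser():
    """Parse a subset of the mealie-mcp flags from the table above."""
    p = argparse.ArgumentParser(prog="mealie-mcp")
    p.add_argument("-t", "--transport",
                   choices=["stdio", "http", "sse"], default="stdio")
    p.add_argument("-s", "--host", default="0.0.0.0")
    p.add_argument("-p", "--port", type=int, default=8000)
    p.add_argument("--auth-type",
                   choices=["none", "static", "jwt", "oauth-proxy",
                            "oidc-proxy", "remote-oauth"], default="none")
    p.add_argument("--eunomia-type",
                   choices=["none", "embedded", "remote"], default="none")
    return p

args = build_parser().parse_args(["--transport", "http", "--port", "9001"])
```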

A2A CLI

Endpoints

  • Web UI: http://localhost:8000/ (if enabled)
  • A2A: http://localhost:8000/a2a (Discovery: /a2a/.well-known/agent.json)
  • AG-UI: http://localhost:8000/ag-ui (POST)

Short Flag Long Flag Description
-h --help Display help information
--host Host to bind the server to (default: 0.0.0.0)
--port Port to bind the server to (default: 9000)
--reload Enable auto-reload
--provider LLM Provider: 'openai', 'anthropic', 'google', 'huggingface'
--model-id LLM Model ID (default: nvidia/nemotron-3-super)
--base-url LLM Base URL (for OpenAI compatible providers)
--api-key LLM API Key
--mcp-url MCP Server URL (default: http://localhost:8000/mcp)
--web Enable Pydantic AI Web UI
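The discovery endpoint listed above serves the agent card as JSON. A minimal stdlib fetch might look like this; the base URL is illustrative and `fetch_agent_card` of course requires a running server:

```python
import json
import urllib.request

AGENT_CARD_PATH = "/a2a/.well-known/agent.json"

def agent_card_url(base_url):
    """Build the A2A discovery URL from the server's base URL."""
    return base_url.rstrip("/") + AGENT_CARD_PATH

def fetch_agent_card(base_url):
    """Fetch and decode the agent card (requires a running A2A server)."""
    with urllib.request.urlopen(agent_card_url(base_url)) as resp:
        return json.load(resp)

url = agent_card_url("http://localhost:9000")
```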


Agentic AI

mealie-mcp is designed to be used by Agentic AI systems. It provides a set of tools that allow agents to interact with Mealie.

Agent-to-Agent (A2A)

The bundled A2A agent server connects an LLM-backed agent to the Mealie MCP server.

CLI

Argument Description Default
--host Host to bind the server to 0.0.0.0
--port Port to bind the server to 9000
--reload Enable auto-reload False
--provider LLM Provider (openai, anthropic, google, huggingface) openai
--model-id LLM Model ID nvidia/nemotron-3-super
--base-url LLM Base URL (for OpenAI compatible providers) http://ollama.arpa/v1
--api-key LLM API Key ollama
--mcp-url MCP Server URL http://mealie-mcp:8000/mcp
--allowed-tools List of allowed MCP tools web_search

Examples

Run A2A Server

mealie-agent --provider openai --model-id gpt-4 --api-key sk-... --mcp-url http://localhost:8000/mcp

Run with Docker

docker run -e CMD=mealie-agent -p 8000:8000 mealie-mcp

Docker

Build

docker build -t mealie-mcp .

Run MCP Server

docker run -p 8000:8000 mealie-mcp

Run A2A Server

docker run -e CMD=mealie-agent -p 8001:8001 mealie-mcp

Deploy MCP Server as a Service

The Mealie MCP server can be deployed using Docker, with configurable authentication, middleware, and Eunomia authorization.

Using Docker Run

docker pull knucklessg1/mealie-mcp:latest

docker run -d \
  --name mealie-mcp \
  -p 8004:8004 \
  -e HOST=0.0.0.0 \
  -e PORT=8004 \
  -e TRANSPORT=http \
  -e AUTH_TYPE=none \
  -e EUNOMIA_TYPE=none \
  -e MEALIE_BASE_URL=https://mealie.example.com \
  -e MEALIE_TOKEN=your-token \
  -e MEALIE_SSL_VERIFY=true \
  knucklessg1/mealie-mcp:latest
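Inside the container, the entrypoint presumably reads these variables; a stand-in for that pattern (variable names taken from the example above, fallback values invented) could look like:

```python
import os

def load_config(env=os.environ):
    """Collect server settings from environment variables, with fallbacks."""
    return {
        "host": env.get("HOST", "0.0.0.0"),
        "port": int(env.get("PORT", "8000")),
        "transport": env.get("TRANSPORT", "stdio"),
        "auth_type": env.get("AUTH_TYPE", "none"),
        "mealie_base_url": env.get("MEALIE_BASE_URL"),
        # MEALIE_SSL_VERIFY arrives as a string; normalize it to a bool.
        "ssl_verify": env.get("MEALIE_SSL_VERIFY", "true").lower() == "true",
    }

cfg = load_config({"PORT": "8004", "TRANSPORT": "http"})
```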

For advanced authentication (e.g., JWT, OAuth Proxy, OIDC Proxy, Remote OAuth) or Eunomia, add the relevant environment variables:

docker run -d \
  --name mealie-mcp \
  -p 8004:8004 \
  -e HOST=0.0.0.0 \
  -e PORT=8004 \
  -e TRANSPORT=http \
  -e AUTH_TYPE=oidc-proxy \
  -e OIDC_CONFIG_URL=https://provider.com/.well-known/openid-configuration \
  -e OIDC_CLIENT_ID=your-client-id \
  -e OIDC_CLIENT_SECRET=your-client-secret \
  -e OIDC_BASE_URL=https://your-server.com \
  -e ALLOWED_CLIENT_REDIRECT_URIS=http://localhost:*,https://*.example.com/* \
  -e EUNOMIA_TYPE=embedded \
  -e EUNOMIA_POLICY_FILE=/app/mcp_policies.json \
  -e MEALIE_BASE_URL=https://mealie.example.com \
  -e MEALIE_TOKEN=your-token \
  -e MEALIE_SSL_VERIFY=true \
  knucklessg1/mealie-mcp:latest

Using Docker Compose

Create a docker-compose.yml file:

services:
  mealie-mcp:
    image: knucklessg1/mealie-mcp:latest
    environment:
      - HOST=0.0.0.0
      - PORT=8004
      - TRANSPORT=http
      - AUTH_TYPE=none
      - EUNOMIA_TYPE=none
      - MEALIE_BASE_URL=https://mealie.example.com
      - MEALIE_TOKEN=your-token
      - MEALIE_SSL_VERIFY=true
    ports:
      - 8004:8004

For advanced setups with authentication and Eunomia:

services:
  mealie-mcp:
    image: knucklessg1/mealie-mcp:latest
    environment:
      - HOST=0.0.0.0
      - PORT=8004
      - TRANSPORT=http
      - AUTH_TYPE=oidc-proxy
      - OIDC_CONFIG_URL=https://provider.com/.well-known/openid-configuration
      - OIDC_CLIENT_ID=your-client-id
      - OIDC_CLIENT_SECRET=your-client-secret
      - OIDC_BASE_URL=https://your-server.com
      - ALLOWED_CLIENT_REDIRECT_URIS=http://localhost:*,https://*.example.com/*
      - EUNOMIA_TYPE=embedded
      - EUNOMIA_POLICY_FILE=/app/mcp_policies.json
      - MEALIE_BASE_URL=https://mealie.example.com
      - MEALIE_TOKEN=your-token
      - MEALIE_SSL_VERIFY=true
    ports:
      - 8004:8004
    volumes:
      - ./mcp_policies.json:/app/mcp_policies.json

Run the service:

docker-compose up -d

Configure mcp.json for AI Integration

{
  "mcpServers": {
    "mealie": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mealie-mcp",
        "mealie-mcp"
      ],
      "env": {
        "MEALIE_BASE_URL": "https://mealie.example.com",
        "MEALIE_TOKEN": "your-token",
        "MEALIE_SSL_VERIFY": "true"
      },
      "timeout": 300000
    }
  }
}

Install Python Package

python -m pip install mealie-mcp
uv pip install mealie-mcp



Download files

Download the file for your platform.

Source Distribution

mealie_mcp-0.3.0.tar.gz (33.3 kB)

Uploaded Source

Built Distribution


mealie_mcp-0.3.0-py3-none-any.whl (30.1 kB)

Uploaded Python 3

File details

Details for the file mealie_mcp-0.3.0.tar.gz.

File metadata

  • Download URL: mealie_mcp-0.3.0.tar.gz
  • Upload date:
  • Size: 33.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for mealie_mcp-0.3.0.tar.gz
Algorithm Hash digest
SHA256 9e4f87934dadbb8a46d0cfab6d508cb1703c7ef24abccf32c2c8125826ed3809
MD5 ce22eb575b6de0c717d6d7382c0731cb
BLAKE2b-256 3d771e8164572841bd95471d545d3c13898ffba832a09bc6260853fdd0572bad


File details

Details for the file mealie_mcp-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: mealie_mcp-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 30.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for mealie_mcp-0.3.0-py3-none-any.whl
Algorithm Hash digest
SHA256 d641c3c7d007bd34d8779316c4a0c3dc51824313f695de5fb6b54e2093dc062b
MD5 53495937a3fb384c7031b147b6d76df3
BLAKE2b-256 76705a4f055b3ea6cec52cb1b2af0b2289226aa0a85772aeb2e2d9fb2dddf1fd

