
MCP Grammar Tools

MCP server for validating and testing llguidance grammars (Lark format). Provides grammar validation, batch test execution, and syntax documentation — ideal for iteratively building grammars with AI coding assistants.

Installation

With uvx (recommended)

uvx guidance-lark-mcp

With pip

pip install guidance-lark-mcp

From source

cd mcp-grammar-tools
pip install -e .

MCP Client Configuration

VS Code / Copilot CLI (~/.copilot/mcp-config.json)

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      },
      "tools": ["*"]
    }
  }
}

Claude Desktop

{
  "mcpServers": {
    "grammar-tools": {
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}

Usage

Available Tools

  1. validate_grammar — Validate grammar completeness and consistency using llguidance's built-in validator.

    {"grammar": "start: \"hello\" \"world\""}
    
  2. run_batch_validation_tests — Run batch validation tests from a JSON file against a grammar. Returns pass/fail statistics and detailed failure info.

    {
      "grammar": "start: /[0-9]+/",
      "test_file": "tests.json"
    }
    

    Test file format:

    [
      {"input": "123", "should_parse": true, "description": "Valid number"},
      {"input": "abc", "should_parse": false, "description": "Not a number"}
    ]
    
  3. get_llguidance_documentation — Fetch the llguidance grammar syntax documentation from the official repo.

  4. generate_with_grammar (optional, requires ENABLE_GENERATION=true) — Generate text using an OpenAI model constrained by a grammar. Uses the Responses API with custom tool grammar format, so output is guaranteed to conform to the grammar. Requires OPENAI_API_KEY environment variable. See Backend Configuration for Azure and other endpoints.
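The test-file contract used by run_batch_validation_tests (tool 2 above) can be sketched as a tiny stand-alone harness. Everything below is illustrative: run_batch and is_number are hypothetical stand-ins, and the server checks inputs with llguidance rather than a regex.

```python
import json
import re

def run_batch(tests_json: str, accepts) -> dict:
    """Run the documented test-file format against any predicate.

    `accepts` is a stand-in for a real grammar check: input -> bool.
    Returns pass/fail statistics in the spirit of the tool's output.
    """
    tests = json.loads(tests_json)
    failures = []
    for case in tests:
        if accepts(case["input"]) != case["should_parse"]:
            failures.append(case.get("description", case["input"]))
    return {
        "total": len(tests),
        "passed": len(tests) - len(failures),
        "failures": failures,
    }

# Stand-in matcher for the example grammar `start: /[0-9]+/`
def is_number(s: str) -> bool:
    return re.fullmatch(r"[0-9]+", s) is not None

tests = """[
  {"input": "123", "should_parse": true, "description": "Valid number"},
  {"input": "abc", "should_parse": false, "description": "Not a number"}
]"""
result = run_batch(tests, is_number)
```

A case fails when the observed parse result disagrees with should_parse, and its description (or raw input) is reported, which is what makes the descriptions useful for pinpointing grammar bugs.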

Backend Configuration

The generate_with_grammar tool uses the OpenAI Python SDK, which natively supports multiple backends via environment variables:

| Backend | Required env vars | Optional env vars |
| --- | --- | --- |
| OpenAI (default) | OPENAI_API_KEY | OPENAI_MODEL |
| Azure OpenAI (API key) | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY | AZURE_OPENAI_API_VERSION, OPENAI_MODEL |
| Azure OpenAI (Entra ID) | AZURE_OPENAI_ENDPOINT + az login | AZURE_OPENAI_API_VERSION, OPENAI_MODEL |
| Custom endpoint | OPENAI_API_KEY, OPENAI_BASE_URL | OPENAI_MODEL |

The server auto-detects which backend to use:

  • If AZURE_OPENAI_ENDPOINT is set → uses AzureOpenAI client (with Entra ID or API key)
  • Otherwise → uses OpenAI client (reads OPENAI_API_KEY and OPENAI_BASE_URL automatically)

The server logs which backend it detects on startup.
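The detection described above amounts to a small branch on the environment. The sketch below is illustrative only: the function name and return labels are invented, and the tie-break when both Azure variables are set is an assumption, not documented server behavior.

```python
def detect_backend(env: dict) -> str:
    """Mirror the documented detection order over a dict of env vars."""
    if env.get("AZURE_OPENAI_ENDPOINT"):
        # AzureOpenAI client; assume an explicit API key wins over Entra ID
        if env.get("AZURE_OPENAI_API_KEY"):
            return "azure (api key)"
        return "azure (entra id)"  # falls back to `az login` credentials
    # Plain OpenAI client; the SDK reads OPENAI_API_KEY / OPENAI_BASE_URL itself
    return "openai"

print(detect_backend({"OPENAI_API_KEY": "sk-..."}))  # openai
print(detect_backend({"AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com"}))  # azure (entra id)
```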

Example: Azure OpenAI (API key)

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "AZURE_OPENAI_API_KEY": "your-azure-key",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}

Example: Azure OpenAI (Entra ID / keyless)

Requires az login and the azure extra. With pip: pip install "guidance-lark-mcp[azure]" (quoted so the brackets are not expanded by the shell); the uvx config below requests the extra directly.

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp[azure]"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}

Example Workflow

Build a grammar iteratively with an AI assistant:

  1. Start with the spec — paste EBNF rules from a language specification
  2. Write a basic grammar — translate a few rules to Lark format
  3. Validate — use validate_grammar to check for missing rules
  4. Write tests — create a JSON test file with sample inputs
  5. Batch test — use run_batch_validation_tests to find failures
  6. Fix & repeat — refine the grammar until all tests pass
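As a concrete instance of steps 2 and 4, a small grammar and a matching test file might look like this; the file names and rules are invented for illustration:

```lark
// date.lark (hypothetical): ISO-style dates, digits and dashes only
start: /[0-9]{4}/ "-" /[0-9]{2}/ "-" /[0-9]{2}/
```

```json
[
  {"input": "2024-01-31", "should_parse": true, "description": "ISO date"},
  {"input": "2024/01/31", "should_parse": false, "description": "Wrong separator"},
  {"input": "24-1-31", "should_parse": false, "description": "Too few digits"}
]
```

Run validate_grammar on the grammar first, then run_batch_validation_tests with the test file; each failing case's description points at the rule to refine.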

Example Grammars

The examples/ directory includes sample grammars built using these tools, with Lark grammar files, test suites, and documentation:

  • GraphQL — executable subset of the GraphQL spec (queries, mutations, fragments, variables)

Development

git clone https://github.com/guidance-ai/guidance-lark-mcp
cd guidance-lark-mcp
uv sync
uv run pytest tests/ -q

Publishing

Releases are automated via GitHub Actions. To publish a new version:

git tag v0.1.0
git push origin v0.1.0

This triggers the release workflow which:

  1. Runs tests across Python 3.10–3.12
  2. Builds and publishes to PyPI (via Trusted Publishing)
  3. Publishes to the MCP Registry
  4. Creates a GitHub Release

PyPI publishing uses Trusted Publishing (OIDC) — no API tokens needed.
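For context, a Trusted Publishing job grants the workflow an OIDC token and uses the official publish action. The excerpt below is a generic sketch of such a job, not this repository's actual release.yml:

```yaml
jobs:
  pypi-publish:
    runs-on: ubuntu-latest
    permissions:
      id-token: write        # lets PyPI verify the workflow via OIDC, no API token
    steps:
      - uses: actions/checkout@v4
      - run: pipx run build  # produce sdist + wheel into dist/
      - uses: pypa/gh-action-pypi-publish@release/v1
```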
