
MCP Grammar Tools

MCP server for validating and testing llguidance grammars (Lark format). Provides grammar validation, batch test execution, and syntax documentation — ideal for iteratively building grammars with AI coding assistants.

Installation

With uvx (recommended)

uvx guidance-lark-mcp

With pip

pip install guidance-lark-mcp

From source

cd mcp-grammar-tools
pip install -e .

MCP Client Configuration

GitHub Copilot CLI

You can add the server using the interactive /mcp add command or by editing the config file directly. See the Copilot CLI MCP documentation for full details.

Option 1: Interactive setup

In the Copilot CLI, run /mcp add, select Local/STDIO, and enter uvx guidance-lark-mcp as the command.

Option 2: Edit config file

Add the following to ~/.copilot/mcp-config.json:

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "tools": ["*"]
    }
  }
}

This gives you grammar validation and batch testing out of the box. To also enable LLM-powered generation (generate_with_grammar), add ENABLE_GENERATION and your credentials to env:

"env": {
  "ENABLE_GENERATION": "true",
  "OPENAI_API_KEY": "your-key-here"
}

For Azure OpenAI (with Entra ID via az login), use guidance-lark-mcp[azure] and set the endpoint instead:

"args": ["guidance-lark-mcp[azure]"],
"env": {
  "ENABLE_GENERATION": "true",
  "AZURE_OPENAI_ENDPOINT": "https://your-resource.openai.azure.com/",
  "OPENAI_MODEL": "your-deployment-name"
}

See Backend Configuration for all supported backends.

After saving, use /mcp show to verify the server is connected.

VS Code

Add the following to .vscode/mcp.json (VS Code's MCP config uses a servers key and a stdio type, rather than Copilot CLI's mcpServers/local):

{
  "servers": {
    "grammar-tools": {
      "type": "stdio",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}

Claude Desktop

{
  "mcpServers": {
    "grammar-tools": {
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}

Usage

Available Tools

  1. validate_grammar — Validate grammar completeness and consistency using llguidance's built-in validator.

    {"grammar": "start: \"hello\" \"world\""}
    
  2. run_batch_validation_tests — Run batch validation tests from a JSON file against a grammar. Returns pass/fail statistics and detailed failure info.

    {
      "grammar": "start: /[0-9]+/",
      "test_file": "tests.json"
    }
    

    Test file format:

    [
      {"input": "123", "should_parse": true, "description": "Valid number"},
      {"input": "abc", "should_parse": false, "description": "Not a number"}
    ]
    
  3. get_llguidance_documentation — Fetch the llguidance grammar syntax documentation from the official repo.

  4. generate_with_grammar (optional, requires ENABLE_GENERATION=true) — Generate text with an OpenAI model constrained by a grammar. The server uses the Responses API's custom-tool grammar format, so the output is guaranteed to conform to the grammar. Requires the OPENAI_API_KEY environment variable. See Backend Configuration for Azure and other endpoints.
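To make the custom-tool grammar format concrete, here is a rough sketch of the request payload the server would send. The tool name, description, and the commented-out call are invented for illustration; the field layout follows OpenAI's grammar-constrained custom tools, and the actual request is built internally by the server.

```python
# Grammar to constrain the model's output (llguidance Lark format).
grammar = 'start: /[0-9]+/'

# A grammar-constrained custom tool for the Responses API.
# "emit_number" is a hypothetical name chosen for this example.
tool = {
    "type": "custom",
    "name": "emit_number",
    "description": "Emit a number matching the grammar.",
    "format": {
        "type": "grammar",
        "syntax": "lark",       # llguidance's Lark-style syntax
        "definition": grammar,
    },
}

# The actual call would then look roughly like:
# client.responses.create(model="gpt-5", input="Pick a number", tools=[tool])
```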

Backend Configuration

The generate_with_grammar tool uses the OpenAI Python SDK, which natively supports multiple backends via environment variables:

| Backend | Required env vars | Optional env vars |
|---|---|---|
| OpenAI (default) | OPENAI_API_KEY | OPENAI_MODEL |
| Azure OpenAI (API key) | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY | AZURE_OPENAI_API_VERSION, OPENAI_MODEL |
| Azure OpenAI (Entra ID) | AZURE_OPENAI_ENDPOINT + az login | AZURE_OPENAI_API_VERSION, OPENAI_MODEL |
| Custom endpoint | OPENAI_API_KEY, OPENAI_BASE_URL | OPENAI_MODEL |

The server auto-detects which backend to use:

  • If AZURE_OPENAI_ENDPOINT is set → uses AzureOpenAI client (with Entra ID or API key)
  • Otherwise → uses OpenAI client (reads OPENAI_API_KEY and OPENAI_BASE_URL automatically)

The server logs which backend it detects on startup.
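The selection rule above is simple enough to sketch in a few lines. This is an illustrative reproduction of the documented behavior, not the server's actual source:

```python
import os

def detect_backend() -> str:
    """Mirror the documented backend-selection rule:
    Azure wins whenever its endpoint is set; otherwise plain OpenAI."""
    if os.environ.get("AZURE_OPENAI_ENDPOINT"):
        return "azure"   # AzureOpenAI client (Entra ID or API key)
    return "openai"      # OpenAI client (reads OPENAI_API_KEY / OPENAI_BASE_URL)
```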

Example: Azure OpenAI (API key)

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "AZURE_OPENAI_API_KEY": "your-azure-key",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}

Example: Azure OpenAI (Entra ID / keyless)

Requires az login and the azure extra: pip install guidance-lark-mcp[azure]

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp[azure]"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}

Example Workflow

Build a grammar iteratively with an AI assistant:

  1. Start with the spec — paste EBNF rules from a language specification
  2. Write a basic grammar — translate a few rules to Lark format
  3. Validate — use validate_grammar to check for missing rules
  4. Write tests — create a JSON test file with sample inputs
  5. Batch test — use run_batch_validation_tests to find failures
  6. Fix & repeat — refine the grammar until all tests pass
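Steps 4 and 5 can be sketched in plain Python: load the JSON test file and tally each case against a parse check, the way run_batch_validation_tests reports pass/fail. Here matches_grammar is a hypothetical stand-in for the real llguidance check against the grammar start: /[0-9]+/:

```python
import json

def matches_grammar(text: str) -> bool:
    # Hypothetical stand-in; the real tool parses each input with llguidance.
    return text != "" and text.isdigit()

def run_tests(tests, predicate):
    """Tally pass/fail and collect failure descriptions."""
    failures = [t for t in tests if predicate(t["input"]) != t["should_parse"]]
    return {"passed": len(tests) - len(failures),
            "failed": len(failures),
            "failures": [t["description"] for t in failures]}

# Same shape as the tests.json file described above.
tests = json.loads("""[
  {"input": "123", "should_parse": true, "description": "Valid number"},
  {"input": "abc", "should_parse": false, "description": "Not a number"}
]""")

report = run_tests(tests, matches_grammar)
```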

Example Grammars

The examples/ directory includes sample grammars built using these tools, with Lark grammar files, test suites, and documentation:

  • GraphQL — executable subset of the GraphQL spec (queries, mutations, fragments, variables)

Troubleshooting

Server fails to connect in Copilot CLI / VS Code?

MCP clients like Copilot CLI typically show only a generic "Connection closed" message when a server crashes on startup. To see the actual error, run the server directly in your terminal:

uvx guidance-lark-mcp

Or with generation enabled:

ENABLE_GENERATION=true OPENAI_API_KEY=your-key uvx guidance-lark-mcp

Common issues:

  • Missing credentials — ENABLE_GENERATION=true without a valid OPENAI_API_KEY or AZURE_OPENAI_ENDPOINT. The server will still start and serve validation tools; generate_with_grammar will return a descriptive error.
  • Azure Entra ID — make sure you've run az login and are using guidance-lark-mcp[azure] (not the base package).
  • Slow first start — uvx needs to resolve and install dependencies on first run, which may exceed the MCP client's connection timeout. Run uvx guidance-lark-mcp once manually to warm the cache.

Development

git clone https://github.com/guidance-ai/guidance-lark-mcp
cd guidance-lark-mcp
uv sync
uv run pytest tests/ -q
