MCP Grammar Tools
MCP server for validating and testing llguidance grammars (Lark format). Provides grammar validation, batch test execution, and syntax documentation — ideal for iteratively building grammars with AI coding assistants.
Installation
With uvx (recommended)
uvx guidance-lark-mcp
With pip
pip install guidance-lark-mcp
From source
cd mcp-grammar-tools
pip install -e .
MCP Client Configuration
GitHub Copilot CLI
You can add the server using the interactive /mcp add command or by editing the config file directly. See the Copilot CLI MCP documentation for full details.
Option 1: Interactive setup
In the Copilot CLI, run /mcp add, select Local/STDIO, and enter uvx guidance-lark-mcp as the command.
Option 2: Edit config file
Add the following to ~/.copilot/mcp-config.json:
{
"mcpServers": {
"grammar-tools": {
"type": "local",
"command": "uvx",
"args": ["guidance-lark-mcp"],
"tools": ["*"]
}
}
}
This gives you grammar validation and batch testing out of the box. To also enable LLM-powered generation (generate_with_grammar), add ENABLE_GENERATION and your credentials to env:
"env": {
"ENABLE_GENERATION": "true",
"OPENAI_API_KEY": "your-key-here"
}
For Azure OpenAI (with Entra ID via az login), use guidance-lark-mcp[azure] and set the endpoint instead:
"args": ["guidance-lark-mcp[azure]"],
"env": {
"ENABLE_GENERATION": "true",
"AZURE_OPENAI_ENDPOINT": "https://your-resource.openai.azure.com/",
"OPENAI_MODEL": "your-deployment-name"
}
See Backend Configuration for all supported backends.
After saving, use /mcp show to verify the server is connected.
VS Code
{
"mcpServers": {
"grammar-tools": {
"type": "local",
"command": "uvx",
"args": ["guidance-lark-mcp"],
"env": {
"ENABLE_GENERATION": "true",
"OPENAI_API_KEY": "your-key-here"
},
"tools": ["*"]
}
}
}
Claude Desktop
{
"mcpServers": {
"grammar-tools": {
"command": "uvx",
"args": ["guidance-lark-mcp"],
"env": {
"ENABLE_GENERATION": "true",
"OPENAI_API_KEY": "your-key-here"
}
}
}
}
Usage
Available Tools
- validate_grammar — Validate grammar completeness and consistency using llguidance's built-in validator.
  Example input: {"grammar": "start: \"hello\" \"world\""}
- run_batch_validation_tests — Run batch validation tests from a JSON file against a grammar. Returns pass/fail statistics and detailed failure info.
  Example input: {"grammar": "start: /[0-9]+/", "test_file": "tests.json"}
  Test file format:
  [
    {"input": "123", "should_parse": true, "description": "Valid number"},
    {"input": "abc", "should_parse": false, "description": "Not a number"}
  ]
- get_llguidance_documentation — Fetch the llguidance grammar syntax documentation from the official repo.
- generate_with_grammar (optional, requires ENABLE_GENERATION=true) — Generate text using an OpenAI model constrained by a grammar. Uses the Responses API with the custom tool grammar format, so output is guaranteed to conform to the grammar. Requires the OPENAI_API_KEY environment variable. See Backend Configuration for Azure and other endpoints.
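The test file for run_batch_validation_tests is plain JSON, so it is easy to generate programmatically. A minimal sketch (the file name tests.json and the cases themselves are illustrative, not part of the package's API):

```python
# Write a tests.json file in the format run_batch_validation_tests expects:
# a list of {"input", "should_parse", "description"} objects.
import json

test_cases = [
    {"input": "123", "should_parse": True, "description": "Valid number"},
    {"input": "abc", "should_parse": False, "description": "Not a number"},
    {"input": "", "should_parse": False, "description": "Empty string"},
]

with open("tests.json", "w") as f:
    json.dump(test_cases, f, indent=2)
```

Pass the resulting path as the test_file argument when calling the tool.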
Backend Configuration
The generate_with_grammar tool uses the OpenAI Python SDK, which natively supports multiple backends via environment variables:
| Backend | Required env vars | Optional env vars |
|---|---|---|
| OpenAI (default) | OPENAI_API_KEY | OPENAI_MODEL |
| Azure OpenAI (API key) | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY | AZURE_OPENAI_API_VERSION, OPENAI_MODEL |
| Azure OpenAI (Entra ID) | AZURE_OPENAI_ENDPOINT + az login | AZURE_OPENAI_API_VERSION, OPENAI_MODEL |
| Custom endpoint | OPENAI_API_KEY, OPENAI_BASE_URL | OPENAI_MODEL |
The server auto-detects which backend to use:
- If AZURE_OPENAI_ENDPOINT is set → uses the AzureOpenAI client (with Entra ID or API key)
- Otherwise → uses the OpenAI client (reads OPENAI_API_KEY and OPENAI_BASE_URL automatically)
The server logs which backend it detects on startup.
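The selection rule above can be sketched as a small function. This mirrors the documented behavior only; it is not the server's actual code, and the returned labels are made up for illustration:

```python
# Sketch of the documented backend auto-detection: AZURE_OPENAI_ENDPOINT wins,
# otherwise the standard OpenAI client is used.
def detect_backend(env: dict) -> str:
    if env.get("AZURE_OPENAI_ENDPOINT"):
        # AzureOpenAI client; falls back to Entra ID when no API key is set
        return "azure-key" if env.get("AZURE_OPENAI_API_KEY") else "azure-entra"
    # OpenAI client; honors OPENAI_API_KEY and OPENAI_BASE_URL
    return "openai"
```

For example, detect_backend({"AZURE_OPENAI_ENDPOINT": "https://x.openai.azure.com"}) selects the keyless Azure path, while an environment with only OPENAI_API_KEY selects the standard OpenAI client.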
Example: Azure OpenAI (API key)
{
"mcpServers": {
"grammar-tools": {
"type": "local",
"command": "uvx",
"args": ["guidance-lark-mcp"],
"env": {
"ENABLE_GENERATION": "true",
"AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
"AZURE_OPENAI_API_KEY": "your-azure-key",
"OPENAI_MODEL": "gpt-4.1"
},
"tools": ["*"]
}
}
}
Example: Azure OpenAI (Entra ID / keyless)
Requires az login and the azure extra: pip install guidance-lark-mcp[azure]
{
"mcpServers": {
"grammar-tools": {
"type": "local",
"command": "uvx",
"args": ["guidance-lark-mcp[azure]"],
"env": {
"ENABLE_GENERATION": "true",
"AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
"OPENAI_MODEL": "gpt-4.1"
},
"tools": ["*"]
}
}
}
Example Workflow
Build a grammar iteratively with an AI assistant:
- Start with the spec — paste EBNF rules from a language specification
- Write a basic grammar — translate a few rules to Lark format
- Validate — use validate_grammar to check for missing rules
- Write tests — create a JSON test file with sample inputs
- Batch test — use run_batch_validation_tests to find failures
- Fix & repeat — refine the grammar until all tests pass
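The batch-test step boils down to comparing each case's actual parseability against its should_parse flag. A runnable sketch of that loop, using a regex as a stand-in for the grammar start: /[0-9]+/ (the real tool uses llguidance to decide parseability; run_tests and its report shape are illustrative, not the tool's output format):

```python
# Compute pass/fail statistics for batch test cases, as the batch-test
# step does. A regex stands in for the grammar start: /[0-9]+/.
import re

NUMBER = re.compile(r"[0-9]+\Z")

def run_tests(cases):
    failures = [
        c for c in cases
        if bool(NUMBER.match(c["input"])) != c["should_parse"]
    ]
    return {
        "passed": len(cases) - len(failures),
        "failed": len(failures),
        "failures": [c["description"] for c in failures],
    }

cases = [
    {"input": "123", "should_parse": True, "description": "Valid number"},
    {"input": "abc", "should_parse": False, "description": "Not a number"},
]
report = run_tests(cases)
```

Any case listed under failures points at a grammar rule to fix before the next iteration.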
Example Grammars
The examples/ directory includes sample grammars built using these tools, with Lark grammar files, test suites, and documentation:
- GraphQL — executable subset of the GraphQL spec (queries, mutations, fragments, variables)
Troubleshooting
Server fails to connect in Copilot CLI / VS Code?
MCP clients like Copilot CLI often report only a generic "Connection closed" when a server crashes on startup. To see the actual error, run the server directly in your terminal:
uvx guidance-lark-mcp
Or with generation enabled:
ENABLE_GENERATION=true OPENAI_API_KEY=your-key uvx guidance-lark-mcp
Common issues:
- Missing credentials — ENABLE_GENERATION=true without a valid OPENAI_API_KEY or AZURE_OPENAI_ENDPOINT. The server will still start and serve validation tools; generate_with_grammar will return a descriptive error.
- Azure Entra ID — make sure you've run az login and are using guidance-lark-mcp[azure] (not the base package).
- Slow first start — uvx needs to resolve and install dependencies on first run, which may exceed the MCP client's connection timeout. Run uvx guidance-lark-mcp once manually to warm the cache.
Development
git clone https://github.com/guidance-ai/guidance-lark-mcp
cd guidance-lark-mcp
uv sync
uv run pytest tests/ -q