# MCP Grammar Tools

MCP server for validating and testing llguidance grammars (Lark format). Provides grammar validation, batch test execution, and syntax documentation — ideal for iteratively building grammars with AI coding assistants.
## Installation

### With uvx (recommended)

```bash
uvx guidance-lark-mcp
```

### With pip

```bash
pip install guidance-lark-mcp
```

### From source

```bash
cd mcp-grammar-tools
pip install -e .
```
## MCP Client Configuration

### VS Code / Copilot CLI (`~/.copilot/mcp-config.json`)

```json
{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      },
      "tools": ["*"]
    }
  }
}
```
### Claude Desktop

```json
{
  "mcpServers": {
    "grammar-tools": {
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}
```
## Usage

### Available Tools

- `validate_grammar` — Validate grammar completeness and consistency using llguidance's built-in validator.

  ```json
  {"grammar": "start: \"hello\" \"world\""}
  ```

- `run_batch_validation_tests` — Run batch validation tests from a JSON file against a grammar. Returns pass/fail statistics and detailed failure info.

  ```json
  {"grammar": "start: /[0-9]+/", "test_file": "tests.json"}
  ```

  Test file format:

  ```json
  [
    {"input": "123", "should_parse": true, "description": "Valid number"},
    {"input": "abc", "should_parse": false, "description": "Not a number"}
  ]
  ```

- `get_llguidance_documentation` — Fetch the llguidance grammar syntax documentation from the official repo.

- `generate_with_grammar` (optional, requires `ENABLE_GENERATION=true`) — Generate text using an OpenAI model constrained by a grammar. Uses the Responses API with the custom tool grammar format, so output is guaranteed to conform to the grammar. Requires the `OPENAI_API_KEY` environment variable. See Backend Configuration for Azure and other endpoints.
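The batch-test loop behind `run_batch_validation_tests` can be pictured in plain Python. This is a minimal sketch, not the server's actual implementation: a regex stands in for a real llguidance parser, matching the example grammar `start: /[0-9]+/` above, and the function name `run_batch_tests` is illustrative.

```python
import json
import re

def run_batch_tests(parses, test_cases):
    """Run each case through a `parses(text) -> bool` predicate and
    collect pass/fail statistics plus details for every failure."""
    failures = []
    for case in test_cases:
        ok = parses(case["input"])
        if ok != case["should_parse"]:
            failures.append({
                "input": case["input"],
                "expected": case["should_parse"],
                "got": ok,
                "description": case.get("description", ""),
            })
    return {
        "total": len(test_cases),
        "passed": len(test_cases) - len(failures),
        "failed": len(failures),
        "failures": failures,
    }

# Stand-in for the grammar `start: /[0-9]+/`; the real tool compiles the
# Lark grammar with llguidance instead of using a regex.
def number_parser(text):
    return re.fullmatch(r"[0-9]+", text) is not None

tests = json.loads("""
[
  {"input": "123", "should_parse": true, "description": "Valid number"},
  {"input": "abc", "should_parse": false, "description": "Not a number"}
]
""")

report = run_batch_tests(number_parser, tests)
```

Both cases above behave as declared, so the report counts two passes and no failures.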
## Backend Configuration

The `generate_with_grammar` tool uses the OpenAI Python SDK, which natively supports multiple backends via environment variables:

| Backend | Required env vars | Optional env vars |
|---|---|---|
| OpenAI (default) | `OPENAI_API_KEY` | `OPENAI_MODEL` |
| Azure OpenAI (API key) | `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_KEY` | `AZURE_OPENAI_API_VERSION`, `OPENAI_MODEL` |
| Azure OpenAI (Entra ID) | `AZURE_OPENAI_ENDPOINT` + `az login` | `AZURE_OPENAI_API_VERSION`, `OPENAI_MODEL` |
| Custom endpoint | `OPENAI_API_KEY`, `OPENAI_BASE_URL` | `OPENAI_MODEL` |
The server auto-detects which backend to use:

- If `AZURE_OPENAI_ENDPOINT` is set → uses the `AzureOpenAI` client (with Entra ID or API key)
- Otherwise → uses the `OpenAI` client (reads `OPENAI_API_KEY` and `OPENAI_BASE_URL` automatically)

The server logs which backend it detects on startup.
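The detection order amounts to a two-step check on the environment. A minimal sketch, assuming illustrative names (`pick_backend` and the returned labels are not part of the server's API):

```python
def pick_backend(env: dict) -> str:
    """Mirror the detection order described above: Azure wins when its
    endpoint is set; otherwise fall back to the plain OpenAI client."""
    if env.get("AZURE_OPENAI_ENDPOINT"):
        # AzureOpenAI client: uses AZURE_OPENAI_API_KEY if present,
        # otherwise Entra ID credentials from `az login`.
        if env.get("AZURE_OPENAI_API_KEY"):
            return "azure-api-key"
        return "azure-entra-id"
    # Plain OpenAI client: reads OPENAI_API_KEY and OPENAI_BASE_URL itself.
    return "openai"
```

Note that `AZURE_OPENAI_ENDPOINT` alone decides between the two client classes; the presence of `AZURE_OPENAI_API_KEY` only selects the authentication mode within Azure.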
### Example: Azure OpenAI (API key)

```json
{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "AZURE_OPENAI_API_KEY": "your-azure-key",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}
```
### Example: Azure OpenAI (Entra ID / keyless)

Requires `az login` and the `azure` extra: `pip install guidance-lark-mcp[azure]`

```json
{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp[azure]"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}
```
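Under the hood, grammar-constrained generation goes through the Responses API's custom-tool grammar format. The sketch below builds a plausible request body without sending it; the field names follow the OpenAI SDK's custom tool shape as I understand it, and the tool name `constrained_output` is illustrative, so verify against current SDK documentation before relying on it.

```python
# The Lark grammar that should constrain the model's output.
grammar = "start: /[0-9]+/"

# A custom tool whose format pins output to the grammar.
tool = {
    "type": "custom",
    "name": "constrained_output",  # illustrative name
    "description": "Output must conform to the Lark grammar below.",
    "format": {"type": "grammar", "syntax": "lark", "definition": grammar},
}

request = {
    "model": "gpt-4.1",  # or whatever OPENAI_MODEL is set to
    "input": "Reply with a number.",
    "tools": [tool],
}

# With a configured OpenAI or AzureOpenAI client this would be sent as:
#   client.responses.create(**request)
```

Because the grammar is attached to the tool's `format`, the model's tool-call output is constrained at decode time rather than validated after the fact.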
## Example Workflow

Build a grammar iteratively with an AI assistant:

- **Start with the spec** — paste EBNF rules from a language specification
- **Write a basic grammar** — translate a few rules to Lark format
- **Validate** — use `validate_grammar` to check for missing rules
- **Write tests** — create a JSON test file with sample inputs
- **Batch test** — use `run_batch_validation_tests` to find failures
- **Fix & repeat** — refine the grammar until all tests pass
## Example Grammars

The `examples/` directory includes sample grammars built using these tools, with Lark grammar files, test suites, and documentation:

- **GraphQL** — executable subset of the GraphQL spec (queries, mutations, fragments, variables)
## Development

```bash
git clone https://github.com/guidance-ai/guidance-lark-mcp
cd guidance-lark-mcp
uv sync
uv run pytest tests/ -q
```
## Publishing

Releases are automated via GitHub Actions. To publish a new version:

```bash
git tag v0.1.0
git push origin v0.1.0
```

This triggers the release workflow, which:

- Runs tests across Python 3.10–3.12
- Builds and publishes to PyPI (via Trusted Publishing)
- Publishes to the MCP Registry
- Creates a GitHub Release

PyPI publishing uses Trusted Publishing (OIDC) — no API tokens needed.