Provider and client helpers for signed LLM transcript verification
llm_sign
llm_sign defines a provider-signed transcript chain for LLM interactions.
The project specifies cryptographic claims over canonicalized LLM interaction semantics. The core v1 protocol defines digest construction, block encoding, signature semantics, chain validation, verifier failure behavior, and an X.509/TLS-style CA profile for authenticating transcript signing keys. It deliberately avoids binding the protocol to a transport format, vendor API schema, or tokenizer representation.
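To illustrate the chaining idea only, here is a minimal stdlib-Python sketch. The canonicalization, digest construction, and block encoding below are illustrative assumptions, not the core v1 protocol's actual definitions, and the function names are hypothetical:

```python
import hashlib
import json

def canonical_digest(obj):
    # Illustrative canonical form: sorted keys, no whitespace, UTF-8.
    # The v1 protocol defines its own canonicalization profiles.
    data = json.dumps(obj, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return hashlib.sha256(data).hexdigest()

def chain_blocks(payloads, prev="0" * 64):
    # Each block commits to its payload digest and the previous block's
    # digest, so modifying any earlier payload breaks every later block.
    blocks = []
    for payload in payloads:
        block = {"payload_digest": canonical_digest(payload), "prev": prev}
        prev = canonical_digest(block)
        blocks.append(block)
    return blocks
```

Signing each block then binds the whole conversation to one verifiable chain rather than to individual messages.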
Start here:
Package Layout
src/llm_sign/
client/ Client-facing verification facade.
server/ Server/provider-facing signing facade.
core/ Protocol encoding, block signing, and chain verification.
profiles/ Canonicalization profiles, including OpenAI Chat Completions.
keys/ Static Ed25519 and X.509 CA-mode key policies.
platforms/ Codex CLI, Kimi CLI, vLLM, and OpenAI-compatible adapters.
vendor/ Backward-compatible provider TLS helpers.
verifier.py High-level artifact verification API.
cli.py llm-sign-verify console entry point.
Preferred application entry points are llm_sign.client.* for verification and
llm_sign.server.* for signing. Compatibility shims remain at
llm_sign.blocks, llm_sign.openai, llm_sign.keys, llm_sign.pki, and
llm_sign.vendor.
For vLLM-style providers, load the same files passed to vllm serve:
import llm_sign

credential = llm_sign.server.TLSCertificateCredential.from_files(
    ssl_certfile="/etc/letsencrypt/live/example.com/fullchain.pem",
    ssl_keyfile="/etc/letsencrypt/live/example.com/privkey.pem",
)
signer = credential.signer()
The certificate key type determines the default signing suite. The built-in suites cover RSA-PSS/SHA-256, P-256 ECDSA/SHA-256, and Ed25519/SHA-256. New suites can be registered without changing the chain verification logic.
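The dispatch can be pictured as a small registry keyed by key type. This is a hypothetical stdlib-only sketch of the idea, not the project's actual registration API:

```python
# Hypothetical registry sketch: the certificate's key type selects a
# default signing suite, and new suites can be registered without
# touching the chain verification logic.
_SUITES = {}

def register_suite(key_type, suite_name):
    _SUITES[key_type] = suite_name

def default_suite(key_type):
    return _SUITES[key_type]

register_suite("rsa", "RSA-PSS/SHA-256")
register_suite("ec-p256", "ECDSA-P256/SHA-256")
register_suite("ed25519", "Ed25519/SHA-256")
```

Because verification looks up the suite by name recorded in the artifact, adding an entry to the registry is sufficient to support a new key type.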
For relay or gateway deployments, the client does not need a manually imported
supplier public key. The OpenAI-compatible response can include both
llm_sign.artifact and the supplier llm_sign.certificate_chain. The verifier
validates that chain against configured trust anchors and then verifies the
artifact under the supplier leaf certificate public key. The relay's HTTPS
certificate authenticates the transport hop only; it is not the supplier signing
identity.
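Conceptually, the client pins trust out of band and checks the supplier chain against it. The toy fingerprint-pinning sketch below conveys the shape of that check; real verification performs full X.509 path validation as specified by the project's CA profile, and the function name here is hypothetical:

```python
import hashlib

def chain_root_trusted(chain_der_certs, trust_anchor_fingerprints):
    # Toy check: the chain's final (root) certificate must match a
    # configured trust anchor by SHA-256 fingerprint. Real verification
    # builds and validates a full X.509 path, then verifies the artifact
    # signature under the supplier leaf certificate's public key.
    root_fp = hashlib.sha256(chain_der_certs[-1]).hexdigest()
    return root_fp in trust_anchor_fingerprints
```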
Clients that need backward compatibility with unsigned providers can use
llm_sign.client.verify_openai_response_signature(...). It returns an optional
signature report instead of raising when the response has no llm_sign data:
has_signature reports whether signature data is present, host_name is the
claimed supplier host name, and valid is True, False, or None (None when
there was no signature to verify).
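The tri-state result can be modeled as a small report object. The field names below come from the description above, but the class itself is a hypothetical sketch, not the library's actual type:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignatureReport:
    has_signature: bool        # whether llm_sign data was present
    host_name: Optional[str]   # claimed supplier host name, if any
    valid: Optional[bool]      # True/False, or None when nothing to verify

def report_for_unsigned_response():
    # An unsigned provider response yields a report rather than an
    # exception, so callers can degrade gracefully.
    return SignatureReport(has_signature=False, host_name=None, valid=None)
```

Treating "no signature" as None rather than False lets callers distinguish an unsigned provider from a failed verification.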
Python Usage
Install in editable mode:
python3 -m pip install -e .
Sign and verify an OpenAI Chat Completions compatible turn:
import llm_sign

keys = llm_sign.server.generate_ed25519_key_pair()
issuer = "provider.example"
signer = llm_sign.server.create_signer(
    issuer=issuer,
    key_id=keys.key_id,
    private_key=keys.private_key,
)
request = {
    "model": "gpt-4.1-mini",
    "messages": [{"role": "user", "content": "Say hello"}],
}
response = {
    "model": "gpt-4.1-mini",
    "choices": [
        {
            "index": 0,
            "finish_reason": "stop",
            "message": {"role": "assistant", "content": "Hello."},
        }
    ],
}
artifact = llm_sign.server.sign_openai_chat_turn(
    request=request,
    response=response,
    signer=signer,
)
result = llm_sign.client.verify_with_public_key(
    artifact,
    issuer=issuer,
    key_id=keys.key_id,
    public_key=keys.public_key,
)
assert result.valid, result.errors
Multi-turn chains can be produced with llm_sign.server.sign_openai_chat_turns.
The tests include a four-block input/output/input/output conversation.
Examples
Runnable examples live in example/:
offline_openai_chat_verify.py verifies a bundled signed OpenAI-compatible response without network access.
openai_client_verify.py calls an OpenAI-compatible endpoint with the OpenAI SDK, reports has_signature, host_name, and valid, and continues when the endpoint does not yet return llm_sign data.
tamper_detection.py shows payload digest mismatch detection after a signed response is modified.
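The tamper-detection idea reduces to recomputing the payload digest and comparing it with the one captured at signing time. This stdlib-only sketch conveys the mechanism (it is not the example's actual code, and the canonical encoding here is an illustrative assumption):

```python
import hashlib
import json

def payload_digest(payload):
    # Illustrative digest over canonical JSON; the real profile defines
    # its own canonical encoding.
    data = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return hashlib.sha256(data).hexdigest()

response = {"message": {"role": "assistant", "content": "Hello."}}
recorded = payload_digest(response)        # digest captured at signing time

response["message"]["content"] = "Hello!"  # post-signing modification
assert payload_digest(response) != recorded  # mismatch exposes the tampering
```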
Run tests:
PYTHONPATH=src python3 -m unittest discover -s tests
Verify a platform artifact with a pinned public key:
llm-sign-verify artifact.json \
--issuer provider.example \
--public-key provider-ed25519-public.pem
Verify an artifact with a supplier certificate chain. In relay deployments,
extract this chain from llm_sign.certificate_chain in the first signed
response:
llm-sign-verify artifact.json \
--issuer example.com \
--certificate-chain supplier-chain.pem \
--trust-anchor root-ca.pem \
--tls-server-name-mode