# lmctx

Context Kernel for LLM APIs. Standardize what happens before and after every model call, while keeping execution in your own runtime.
- Before call: `adapter.plan(context, spec)` builds provider-ready payloads and diagnostics
- After call: `adapter.ingest(context, response, spec=...)` normalizes output back into `Context`
- Boundary: lmctx never sends HTTP requests, executes tools, or orchestrates loops
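This division of labor can be pictured with a toy stand-in (plain Python, not lmctx's real types or signatures): planning and ingestion are pure transformations over conversation state, while the actual provider call stays in your own runtime.

```python
from dataclasses import dataclass, replace

# Toy sketch of the plan/ingest boundary -- illustrative only,
# not lmctx's actual Context/adapter implementation.

@dataclass(frozen=True)
class Context:
    messages: tuple = ()

    def user(self, text: str) -> "Context":
        # Immutable update: return a new Context, leave self untouched.
        return replace(self, messages=self.messages + ({"role": "user", "content": text},))

def plan(ctx: Context, model: str) -> dict:
    # "Before call": turn context into a provider-ready payload.
    return {"model": model, "messages": list(ctx.messages)}

def ingest(ctx: Context, response_text: str) -> Context:
    # "After call": normalize provider output back into the context.
    return replace(ctx, messages=ctx.messages + ({"role": "assistant", "content": response_text},))

ctx = Context().user("What is the capital of France?")
payload = plan(ctx, "gpt-4o-mini")
# ... the provider SDK is called with `payload` in your own code ...
ctx = ingest(ctx, "Paris.")
print(ctx.messages[-1]["content"])  # Paris.
```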
## Why lmctx

- Append-only, snapshot-friendly context model (`Context`) with immutable-by-default updates
- Unified part model (`Part`) for text, images, files, tool calls/results, thinking, and compaction
- Loss-resistant round-trips for opaque provider payloads through `provider_raw` and blob references
- Pluggable blob storage (`InMemoryBlobStore`, `FileBlobStore`, or a custom `BlobStore`)
- Provider adapters plus auto routing via `AutoAdapter` on `(provider, endpoint, api_version)`
- Explainable planning through `RequestPlan` (`included`, `excluded`, `warnings`, `errors`)
- Minimal dependencies (the core package has no runtime dependencies; provider SDKs are optional extras)
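The "blob storage with integrity verification" idea can be illustrated with a minimal content-addressed store. This is a sketch of the concept only; lmctx's real `BlobStore` interface is documented in `docs/data-model.md`.

```python
import hashlib

# Minimal content-addressed blob store sketch; lmctx's actual
# BlobStore/BlobReference contract may differ.

class ToyBlobStore:
    def __init__(self):
        self._blobs: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        # The SHA-256 digest doubles as the blob reference.
        digest = hashlib.sha256(data).hexdigest()
        self._blobs[digest] = data
        return digest

    def get(self, ref: str) -> bytes:
        data = self._blobs[ref]
        # Integrity check: stored bytes must still match their reference.
        if hashlib.sha256(data).hexdigest() != ref:
            raise ValueError("blob corrupted")
        return data

store = ToyBlobStore()
ref = store.put(b"large image bytes")
assert store.get(ref) == b"large image bytes"
```

Keeping large binary payloads out of line like this is what lets the conversation log itself stay small and snapshot-friendly.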
## Install

```shell
pip install lmctx

# provider extras (optional)
pip install 'lmctx[openai]'
pip install 'lmctx[anthropic]'
pip install 'lmctx[google]'
pip install 'lmctx[bedrock]'
pip install 'lmctx[all]'
```
## 5-Minute Integration

```python
from openai import OpenAI

from lmctx import AutoAdapter, Context, RunSpec
from lmctx.spec import Instructions

# 1) Build conversation state
ctx = Context().user("What is the capital of France?")

# 2) Describe runtime call settings
spec = RunSpec(
    provider="openai",
    endpoint="responses.create",
    model="gpt-4o-mini",
    instructions=Instructions(system="You are concise and accurate."),
)

# 3) Build the request payload with lmctx
router = AutoAdapter()
plan = router.plan(ctx, spec)

# 4) Execute with the provider SDK in your own code
client = OpenAI()
response = client.responses.create(**plan.request)

# 5) Normalize the response back into the Context
ctx = router.ingest(ctx, response, spec=spec)

assistant = ctx.last(role="assistant")
if assistant:
    print(assistant.parts[0].text)
```
## Core Types

| Type | Role |
|---|---|
| `Context` | Append-only conversation log (messages, cursor, usage_log, blob_store) |
| `Part` / `Message` | Canonical content model shared across adapters |
| `RunSpec` | Call configuration (provider, endpoint, model, tools, schema, extras) |
| `RequestPlan` | Planned payload plus diagnostics for observability and debugging |
| `BlobReference` / `BlobStore` | Out-of-line binary/opaque payload storage with integrity verification |
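The "planned payload plus diagnostics" contract can be sketched as follows. The field names mirror those listed above (`included`, `excluded`, `warnings`, `errors`), but the shape here is a hypothetical simplification, not lmctx's actual `RequestPlan` class.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a plan-with-diagnostics shape; the real
# RequestPlan contract is specified in docs/data-model.md.

@dataclass
class Plan:
    request: dict
    included: list = field(default_factory=list)
    excluded: list = field(default_factory=list)
    warnings: list = field(default_factory=list)
    errors: list = field(default_factory=list)

def build_plan(parts: list[dict], supports_images: bool) -> Plan:
    plan = Plan(request={"messages": []})
    for part in parts:
        if part.get("kind") == "image" and not supports_images:
            # Instead of failing silently, record what was dropped and why.
            plan.excluded.append(part)
            plan.warnings.append("image part dropped: endpoint is text-only")
        else:
            plan.included.append(part)
            plan.request["messages"].append(part)
    return plan

p = build_plan([{"kind": "text", "content": "hi"}, {"kind": "image"}], supports_images=False)
print(p.warnings)  # ['image part dropped: endpoint is text-only']
```

Surfacing exclusions and warnings on the plan object is what makes the planning step explainable: you can log or assert on them before spending a provider call.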
## Built-in Adapters

| Adapter | RunSpec selector | Typical SDK call |
|---|---|---|
| `OpenAIResponsesAdapter` | `openai` / `responses.create` | `client.responses.create(**plan.request)` |
| `OpenAIResponsesCompactAdapter` | `openai` / `responses.compact` | `client.responses.compact(**plan.request)` |
| `OpenAIChatCompletionsAdapter` | `openai` / `chat.completions` | `client.chat.completions.create(**plan.request)` |
| `OpenAIImagesAdapter` | `openai` / `images.generate` | `client.images.generate(**plan.request)` |
| `AnthropicMessagesAdapter` | `anthropic` / `messages.create` | `client.messages.create(**plan.request)` |
| `GoogleGenAIAdapter` | `google` / `models.generate_content` | `client.models.generate_content(**plan.request)` |
| `BedrockConverseAdapter` | `bedrock` / `converse` | `client.converse(**plan.request)` |
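Routing on a `(provider, endpoint)` selector can be pictured as a registry lookup. This is a sketch of the idea only; `AutoAdapter`'s real dispatch (including `api_version` handling) lives in lmctx itself.

```python
# Sketch of selector-based adapter routing, keyed on (provider, endpoint).
# The adapter names come from the table above; the lookup logic is illustrative.

REGISTRY = {
    ("openai", "responses.create"): "OpenAIResponsesAdapter",
    ("openai", "chat.completions"): "OpenAIChatCompletionsAdapter",
    ("anthropic", "messages.create"): "AnthropicMessagesAdapter",
    ("bedrock", "converse"): "BedrockConverseAdapter",
}

def route(provider: str, endpoint: str) -> str:
    try:
        return REGISTRY[(provider, endpoint)]
    except KeyError:
        raise ValueError(f"no adapter for {provider}/{endpoint}") from None

print(route("anthropic", "messages.create"))  # AnthropicMessagesAdapter
```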
## Documentation

- `docs/README.md`: doc map and recommended reading paths
- `docs/architecture.md`: boundaries, lifecycle, extension points
- `docs/data-model.md`: concrete type contracts and invariants
- `docs/api-reference.md`: public API quick reference
- `docs/adapters.md`: adapter matrix and provider caveats
- `docs/examples.md`: runnable examples and prerequisites
- `docs/logs.md`: log files and regeneration workflow
## Examples

Scripts are in `examples/`:

- Core (no API keys): `quickstart.py`, `multimodal.py`, `blob_stores.py`, `tool_calling.py`
- OpenAI: `api_openai_responses.py`, `api_openai_compact.py`, `api_openai_chat.py`, `api_openai_images.py`
- Anthropic: `api_anthropic.py`, `api_anthropic_compact.py`
- Google: `api_google_genai.py`, `api_google_image_generation.py`
- Bedrock: `api_bedrock.py`

Run one:

```shell
uv run python examples/quickstart.py
```
## Recorded Logs

Example outputs can be stored locally under `examples/logs/` (git-ignored by default).
See `docs/logs.md` for mapping and regeneration commands.
## Development

See `CONTRIBUTING.md` for full guidelines.

```shell
uv sync --all-extras --dev
make check
```
## Requirements

- Python `>=3.10,<3.15`
## License

Apache License 2.0. See `LICENSE`.
## File details

Details for the file `lmctx-0.1.0.tar.gz`.

### File metadata

- Download URL: lmctx-0.1.0.tar.gz
- Upload date:
- Size: 4.9 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c836caac8ee3358517ff6299e6a76b63cc86774ddd30afcd5b947ab0d19b96e0` |
| MD5 | `1beb2ab2e438591a51189c5a89974446` |
| BLAKE2b-256 | `e7c2a840e9894babc9508fa4780d4b0212cfc6fa488a3ed079d5993a1227ba4c` |
## File details

Details for the file `lmctx-0.1.0-py3-none-any.whl`.

### File metadata

- Download URL: lmctx-0.1.0-py3-none-any.whl
- Upload date:
- Size: 57.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e13d5d80cf31c7e5b7f18d75ab8b4d7d951e59a128af0c447c01786973d568a0` |
| MD5 | `1460ff8060b9a2a6adfa8e9fb78aa1fb` |
| BLAKE2b-256 | `4f28dfef09266aedfbcd495eb5d9537ec75cd08c280dca3eed8ac8bcfe0d880b` |