
tibet-context

Layered context container with TIBET provenance and JIS capability gating.

"Audit is context. Context is key. TIBET is the answer."


The Blu-ray Model

Inspired by Blu-ray disc architecture: the same data, different access levels. A JIS capability gate, like the AACS keys on a disc, determines which model can read which layer.

The same TIBET chain serves two consumers:

  • Human / regulator: audit trail (compliance, evidence)
  • AI model: context window (memory, reasoning)
┌─────────────────────────────────────────────┐
│              tibet-context                   │
│  ┌───────────────────────────────────────┐  │
│  │  L0: Summary Layer (always readable)  │  │
│  │  - Compact summary (~512 tokens)      │  │
│  │  - Any model can read this (3B+)      │  │
│  ├───────────────────────────────────────┤  │
│  │  L1: Conversation Layer               │  │
│  │  - Full conversation context          │  │
│  │  - Requires: JIS capability >= 14B    │  │
│  ├───────────────────────────────────────┤  │
│  │  L2: Deep Context Layer               │  │
│  │  - Full codebase + cross-session mem  │  │
│  │  - Requires: JIS capability >= 32B    │  │
│  ├───────────────────────────────────────┤  │
│  │  TIBET Chain (through all layers)     │  │
│  │  - Provenance trail + integrity       │  │
│  └───────────────────────────────────────┘  │
│  JIS Capability Gate                        │
└─────────────────────────────────────────────┘
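The layering above can be sketched as a plain data structure. This is a conceptual illustration only, not the library's actual classes; the 14B/32B thresholds are the defaults from the diagram.

```python
from dataclasses import dataclass, field

@dataclass
class LayeredContext:
    # Conceptual sketch of the three layers -- not tibet-context's real API.
    summary: str                                            # L0: always readable
    conversation: list[dict] = field(default_factory=list)  # L1: requires >= 14B
    deep_context: str = ""                                  # L2: requires >= 32B

    def view(self, capability: int) -> dict:
        # Return only the layers this capability (in B params) unlocks.
        out = {"L0": self.summary}
        if capability >= 14:
            out["L1"] = self.conversation
        if capability >= 32:
            out["L2"] = self.deep_context
        return out
```

A 3B model sees only `{"L0": ...}`, while a 32B model gets all three keys, which is exactly the Blu-ray behavior: one container, capability-dependent views.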

Installation

pip install tibet-context

Requires Python 3.10+ and tibet-core >= 0.3.0 (installed automatically). Zero other dependencies.

Quick Start

from tibet_context import ContextBuilder, ContextReader, CapabilityGate

# Build a layered context from conversation
builder = ContextBuilder()
container = builder.from_conversation(
    messages=[
        {"role": "user", "content": "How do I make pasta carbonara?"},
        {"role": "assistant", "content": "Cook pasta, put egg on it."},
    ],
    deep_context="Carbonara requires guanciale, pecorino, egg yolks, black pepper...",
)

# Read with capability filtering
reader = ContextReader()
reader.read(container, model_id="qwen2.5:32b")  # All 3 layers
reader.read(container, model_id="qwen2.5:3b")   # Only L0 summary

# Carbonara test — can this model handle deep context?
gate = CapabilityGate()
gate.carbonara_test("qwen2.5:3b")   # False — zakjapanner!
gate.carbonara_test("qwen2.5:32b")  # True

v0.1.0 — Core Engine

The core protocol: layered containers, capability gating, binary serialization, and TIBET provenance. Everything needed to prove the concept works.

Modules

| Module | Purpose |
| --- | --- |
| `layers.py` | `Layer`, `LayerSpec`, `CapabilityProfile` — configurable layer definitions |
| `container.py` | `ContextContainer` — the core layered context unit |
| `gate.py` | `CapabilityGate` — JIS capability gate + carbonara test |
| `builder.py` | `ContextBuilder` — build from conversations, chains, or merge containers |
| `reader.py` | `ContextReader` — capability-filtered reading |
| `compactor.py` | `Compactor` — intelligent per-layer context compaction |
| `serializer.py` | JSON + binary `.tctx` format with integrity verification |

Capability Profiles

Profiles are fully configurable — no hardcoded thresholds. The default is tuned for the Qwen family:

# tibet-context.toml
[profile]
name = "qwen"

[profile.layers.0]
min_capability = 3    # Qwen 3B can read L0
max_tokens = 512

[profile.layers.1]
min_capability = 14   # Qwen 14B for L1
max_tokens = 4096

[profile.layers.2]
min_capability = 32   # Qwen 32B for L2
max_tokens = 16384
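Under this profile, a model's readable layers follow directly from the `min_capability` thresholds. A minimal sketch of that lookup — the threshold values come from the TOML above; the helper itself is illustrative, not the library's API:

```python
# Layer -> minimum capability (in billions of parameters), per the default qwen profile.
QWEN_THRESHOLDS = {0: 3, 1: 14, 2: 32}

def readable_layers(capability: int, thresholds: dict[int, int] = QWEN_THRESHOLDS) -> list[int]:
    # Every layer whose minimum capability the model meets.
    return [layer for layer, need in sorted(thresholds.items()) if capability >= need]

readable_layers(3)   # [0]
readable_layers(14)  # [0, 1]
readable_layers(32)  # [0, 1, 2]
```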

Create your own profile for any model family:

from tibet_context import CapabilityProfile

# Load from file
profile = CapabilityProfile.from_file("my-profile.toml")

# Or build programmatically
from tibet_context.layers import LayerSpec
profile = CapabilityProfile(name="llama", layers={
    0: LayerSpec(level=0, min_capability=1, max_tokens=512),
    1: LayerSpec(level=1, min_capability=8, max_tokens=4096),
    2: LayerSpec(level=2, min_capability=70, max_tokens=16384),
})

Binary .tctx Format

Compact binary format for efficient storage and transport:

from tibet_context import serializer

# Write
serializer.to_tctx_file(container, "context.tctx")

# Read
restored = serializer.from_tctx_file("context.tctx")
assert restored.verify_integrity()

Format: TCTX magic header, version, layers with content hashes, TCTX footer verification.

The Carbonara Test

The "zakjapanner" (Dutch for "pocket calculator") problem: a small model that gives superficially plausible but actually wrong answers, like putting raw egg on pasta and calling it carbonara.

gate = CapabilityGate()

# Small model: can only see the summary
gate.carbonara_test("qwen2.5:3b")   # False — needs escalation
gate.carbonara_test("qwen2.5:7b")   # False

# Large model: can see deep context with the real technique
gate.carbonara_test("qwen2.5:32b")  # True — can handle it
gate.carbonara_test("qwen2.5:72b")  # True
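One plausible way such a gate can decide, given only an Ollama-style model id, is to parse the parameter count out of the tag and compare it against the L2 threshold. The helpers below are illustrative, not `CapabilityGate`'s actual implementation; the threshold of 32 comes from the default qwen profile.

```python
import re

DEEP_CONTEXT_MIN = 32  # L2 threshold from the default qwen profile

def parse_capability(model_id: str) -> int:
    # Hypothetical helper: pull the parameter count in billions from an
    # Ollama-style id such as "qwen2.5:32b".
    m = re.search(r":(\d+(?:\.\d+)?)b$", model_id.lower())
    if m is None:
        raise ValueError(f"no size tag in model id {model_id!r}")
    return int(float(m.group(1)))

def passes_carbonara(model_id: str) -> bool:
    # One reading of the carbonara test: the model must clear the
    # deep-context (L2) capability threshold.
    return parse_capability(model_id) >= DEEP_CONTEXT_MIN

passes_carbonara("qwen2.5:3b")   # False
passes_carbonara("qwen2.5:32b")  # True
```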

CLI

# Version
python -m tibet_context --version

# Package info + default profile
python -m tibet_context info

# Run the carbonara demo
python -m tibet_context demo

# Read a container file
python -m tibet_context read context.tctx --model qwen2.5:32b

# Show a capability profile
python -m tibet_context profile --file my-profile.toml

Relation to TIBET Ecosystem

tibet-core (Token, Chain, Provider, FileStore)
    │ provides provenance
tibet-context (Container, Layers, Gate, Builder)
    │ feeds context to           │ hooks into
OomLlama (.oom chunks)     KmBiT (orchestration)
    │ runs on
P520 GPU (Qwen 3B/7B/32B)

tibet-context is the glue between audit (tibet) and AI (OomLlama/KmBiT): it transforms the audit trail into actionable context.

Roadmap

  • v0.1.0 — Core Engine (current) — protocol + gating + serialization
  • v0.2.0 — Integration Layer — KmBiT orchestration, OomLlama memory bridge, tibet-core Provider hooks

License

MIT — Humotica
