
Cognitive Load Manager — real-time metacognitive middleware for LLM agents

Project description

CLM — Cognitive Load Manager

Real-time metacognitive middleware for LLM agents. Detects when your agent is cognitively overloaded and intervenes before it hallucinates, drifts, or crashes.

Installation

# Lightweight install (numpy only, ~10MB)
pip install clm-plugin

# Full install with embedding support (~1.5GB)
pip install clm-plugin[embed]
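To check which variant you ended up with, you can probe for the optional embedding dependency. This assumes the `[embed]` extra pulls in sentence-transformers, as the `no_embed` config option hints; that mapping is an educated guess, not documented behavior.

```python
import importlib.util

# True when the [embed] extra's heavy dependency is importable.
# Assumes the extra installs sentence-transformers (an assumption
# based on the no_embed option, not confirmed by the docs).
has_embed = importlib.util.find_spec("sentence_transformers") is not None
print("embedding support:", has_embed)
```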

Quickstart

from clm import CLM

clm = CLM(verbose=True)

# In your agent loop, add one line:
result = clm.observe_raw(llm_output)

if result.action == "interrupt":
    # CLM detected cognitive overload
    prompt = f"Clarification needed: {result.clarification}"
elif result.action == "patch":
    # Use compressed context in next call
    context = result.context

What it monitors

CLM tracks four cognitive signals:

  • Branching — too many tasks in flight
  • Repetition — agent going in circles
  • Uncertainty — excessive hedging language
  • Goal drift — wandering from original intent
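As an illustration of what one of these signals might measure, repetition could be approximated by the n-gram overlap between successive outputs. This is a hedged sketch; CLM's actual signal extraction is not documented here.

```python
def ngram_overlap(prev: str, curr: str, n: int = 3) -> float:
    """Jaccard overlap of word n-grams between two outputs (0.0-1.0).

    A high value suggests the agent is going in circles. Illustrative
    stand-in only, not CLM's actual repetition metric.
    """
    def ngrams(text: str) -> set:
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    a, b = ngrams(prev), ngrams(curr)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)
```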

These signals combine into a single CLM score (0–100), which determines the action:

Zone   Score    Action
Green  0–40     Pass through
Amber  40–70    Compress task branches, patch context
Red    70–100   Full compression + goal re-anchor
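The mapping implied by the table can be sketched as a weighted combination plus threshold check. The weights and thresholds below are the defaults from the Configuration section; the actual scoring internals are assumptions.

```python
def clm_score(signals: list[float], weights: list[float]) -> float:
    """Combine normalized signals (each 0.0-1.0) into a 0-100 score.

    Signal order matches the config: [branching, repetition,
    uncertainty, goal_distance]. Illustrative sketch only.
    """
    return 100.0 * sum(s * w for s, w in zip(signals, weights))

def zone(score: float, green_max: float = 40.0, amber_max: float = 70.0) -> str:
    """Map a CLM score onto the Green/Amber/Red zones from the table."""
    if score <= green_max:
        return "Green"
    if score <= amber_max:
        return "Amber"
    return "Red"
```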

Configuration

from clm import CLM, CLMConfig

config = CLMConfig(
    weights=[0.30, 0.25, 0.25, 0.20],  # [branching, repetition, uncertainty, goal_distance]
    green_max=40.0,
    amber_max=70.0,
    no_embed=False,  # Auto-set to True if sentence-transformers not installed
)

clm = CLM(config, verbose=True)
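One practical check when overriding the defaults: the four weights should sum to 1.0 so the combined score stays on the 0–100 scale. A sketch of such a sanity check follows; whether CLM validates this itself is not documented, and `check_weights` is a hypothetical helper, not part of the package.

```python
def check_weights(weights: list[float], tol: float = 1e-9) -> None:
    """Raise if the four signal weights don't sum to 1.0.

    Keeps the combined CLM score on the intended 0-100 scale.
    Illustrative helper; not part of the clm package.
    """
    if len(weights) != 4:
        raise ValueError(f"expected 4 weights, got {len(weights)}")
    total = sum(weights)
    if abs(total - 1.0) > tol:
        raise ValueError(f"weights sum to {total}, expected 1.0")

check_weights([0.30, 0.25, 0.25, 0.20])  # the defaults pass
```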

Adapters

LangChain

from clm.adapters import CLMCallbackHandler

handler = CLMCallbackHandler(verbose=True)
agent.run("your task", callbacks=[handler])

Loop Decorator

from clm.adapters import CLMLoop

loop = CLMLoop(verbose=True)

@loop
def agent_step(prompt: str) -> str:
    return openai_client.chat(prompt)
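For intuition, a decorator of this shape might score each step's output and inject a re-anchoring prompt when the score lands in the Red zone. This is a hypothetical minimal stand-in; `CLMLoop`'s real behavior may differ.

```python
from typing import Callable

class MiniLoop:
    """Toy stand-in for a CLMLoop-style decorator (illustrative only)."""

    def __init__(self, score_fn: Callable[[str], float], amber_max: float = 70.0):
        self.score_fn = score_fn   # maps an output to a 0-100 load score
        self.amber_max = amber_max
        self.interventions = 0

    def __call__(self, step: Callable[[str], str]) -> Callable[[str], str]:
        def wrapped(prompt: str) -> str:
            output = step(prompt)
            if self.score_fn(output) > self.amber_max:
                # Red zone: re-anchor by asking the step to restate its goal.
                self.interventions += 1
                output = step(f"Restate your current goal, then answer: {prompt}")
            return output
        return wrapped
```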

Observability

clm.get_score()       # Current CLM score (0–100)
clm.get_zone()        # "Green" | "Amber" | "Red"
clm.get_history()     # Step-by-step intervention log
clm.summary()         # Session statistics
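As an illustration of what a session summary might aggregate, here is a sketch that reduces a step-by-step log like the one `get_history()` returns into per-zone statistics. The entry format (`score`/`zone`/`action` keys) is an assumption, not documented.

```python
from collections import Counter

def summarize(history: list[dict]) -> dict:
    """Aggregate a step log into session statistics.

    Assumes each entry looks like {"score": float, "zone": str,
    "action": str}; the real get_history() format may differ.
    """
    zones = Counter(entry["zone"] for entry in history)
    scores = [entry["score"] for entry in history]
    return {
        "steps": len(history),
        "zones": dict(zones),
        "peak_score": max(scores, default=0.0),
        "interventions": sum(1 for e in history if e["action"] != "pass"),
    }
```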

License

MIT

Download files


Source Distribution

clm_plugin-0.1.1.tar.gz (46.5 kB)


Built Distribution


clm_plugin-0.1.1-py3-none-any.whl (31.8 kB)


File details

Details for the file clm_plugin-0.1.1.tar.gz.

File metadata

  • Download URL: clm_plugin-0.1.1.tar.gz
  • Upload date:
  • Size: 46.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.6

File hashes

Hashes for clm_plugin-0.1.1.tar.gz
Algorithm    Hash digest
SHA256       1c528059b276082f663796781f71e89e89695ae1ea1476124ebc6179fa0bad0f
MD5          2c367e18f2267522651c0f031e70c326
BLAKE2b-256  4c872cd905d203569554e9b389ac382f5060037a532ddfef68f1c13e9e955859


File details

Details for the file clm_plugin-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: clm_plugin-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 31.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.6

File hashes

Hashes for clm_plugin-0.1.1-py3-none-any.whl
Algorithm    Hash digest
SHA256       30dd5e5495788a42bcc8b841f62f38034160f53e40b5e9e64a89be92a00122df
MD5          7b65415184efbddd80fa1020849eba6c
BLAKE2b-256  ff2d6afbcba3f8d45651e64524c79ae736f011b2641ee85a77152e75c113f38b

