
CLM — Cognitive Load Manager

Real-time metacognitive middleware for LLM agents. Detects when your agent is cognitively overloaded and intervenes before it hallucinates, drifts, or crashes.

Installation

# Lightweight install (numpy only, ~10MB)
pip install clm-plugin

# Full install with embedding support (~1.5GB)
pip install clm-plugin[embed]

Quickstart

from clm import CLM

clm = CLM(verbose=True)

# In your agent loop, add one line:
result = clm.observe_raw(llm_output)

if result.action == "interrupt":
    # CLM detected cognitive overload
    prompt = f"Clarification needed: {result.clarification}"
elif result.action == "patch":
    # Use compressed context in next call
    context = result.context

What it monitors

CLM tracks four cognitive signals:

  • Branching — too many tasks in flight
  • Repetition — agent going in circles
  • Uncertainty — excessive hedging language
  • Goal drift — wandering from original intent
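To illustrate the kind of heuristic such a signal can use, here is a sketch of an uncertainty estimator based on hedge-phrase density. This is illustrative only: the `HEDGES` lexicon and the scaling are assumptions, not CLM's actual implementation.

```python
import re

# Hypothetical hedge-word list; CLM's real lexicon and scoring are not shown here.
HEDGES = {"maybe", "perhaps", "possibly", "might", "could", "unsure", "unclear"}

def uncertainty_signal(text: str) -> float:
    """Return a 0-1 uncertainty estimate: the fraction of words that are
    hedges, scaled so roughly one hedge per ten words saturates the signal."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hedge_rate = sum(w in HEDGES for w in words) / len(words)
    return min(1.0, hedge_rate * 10)
```

A confident sentence scores near 0.0, while heavy hedging ("might possibly... unsure") quickly saturates to 1.0.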

These four signals combine into a single CLM score (0–100), which determines the intervention:

Zone    Score    Action
Green   0–40     Pass through
Amber   40–70    Compress task branches, patch context
Red     70–100   Full compression + goal re-anchor
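The weighting and zoning can be pictured with a small self-contained sketch. The default weights and thresholds mirror the configuration shown in this README, but the functions themselves are illustrative, not the library's code:

```python
def clm_score(signals, weights=(0.30, 0.25, 0.25, 0.20)):
    """Combine four 0-1 signals [branching, repetition, uncertainty,
    goal_distance] into a 0-100 score via a weighted sum."""
    return 100.0 * sum(s * w for s, w in zip(signals, weights))

def zone(score, green_max=40.0, amber_max=70.0):
    """Map a score to its intervention zone using the configured thresholds."""
    if score <= green_max:
        return "Green"
    if score <= amber_max:
        return "Amber"
    return "Red"
```

Because the default weights sum to 1.0, four fully saturated signals yield a score of 100.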

Configuration

from clm import CLM, CLMConfig

config = CLMConfig(
    weights=[0.30, 0.25, 0.25, 0.20],  # [branching, repetition, uncertainty, goal_distance]
    green_max=40.0,
    amber_max=70.0,
    no_embed=False,  # Auto-set to True if sentence-transformers not installed
)

clm = CLM(config, verbose=True)

Adapters

LangChain

from clm.adapters import CLMCallbackHandler

handler = CLMCallbackHandler(verbose=True)
agent.run("your task", callbacks=[handler])

Loop Decorator

from clm.adapters import CLMLoop

loop = CLMLoop(verbose=True)

@loop
def agent_step(prompt: str) -> str:
    return openai_client.chat(prompt)
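Conceptually, the decorator wraps each agent step so the monitor observes every output before it is returned. A minimal stand-in for this pattern (a hypothetical `MonitorLoop`, not the `clm.adapters` implementation) looks like:

```python
import functools

class MonitorLoop:
    """Hypothetical stand-in for the decorator pattern: every wrapped
    call's output is passed to an observer, and observations are logged."""

    def __init__(self, observer):
        self.observer = observer
        self.history = []

    def __call__(self, step_fn):
        @functools.wraps(step_fn)
        def wrapped(prompt):
            output = step_fn(prompt)
            # Record the observer's verdict for this step.
            self.history.append(self.observer(output))
            return output
        return wrapped
```

The wrapped function's return value is unchanged; only the side channel (`history`) accumulates observations, which is what lets a monitor sit in the loop without altering agent behavior until it chooses to intervene.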

Observability

clm.get_score()       # Current CLM score (0–100)
clm.get_zone()        # "Green" | "Amber" | "Red"
clm.get_history()     # Step-by-step intervention log
clm.summary()         # Session statistics

License

MIT
