CLM — Cognitive Load Manager

Real-time metacognitive middleware for LLM agents. CLM detects when your agent is cognitively overloaded and intervenes before it hallucinates, drifts, or crashes.
Installation
```shell
# Lightweight install (numpy only, ~10MB)
pip install clm-plugin

# Full install with embedding support (~1.5GB)
# (quoted so shells like zsh don't expand the brackets)
pip install "clm-plugin[embed]"
```
Quickstart
```python
from clm import CLM

clm = CLM(verbose=True)

# In your agent loop, add one line:
result = clm.observe_raw(llm_output)

if result.action == "interrupt":
    # CLM detected cognitive overload
    prompt = f"Clarification needed: {result.clarification}"
elif result.action == "patch":
    # Use compressed context in next call
    context = result.context
```
What it monitors
CLM tracks four cognitive signals:
- Branching — too many tasks in flight
- Repetition — agent going in circles
- Uncertainty — excessive hedging language
- Goal drift — wandering from original intent
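Each of these signals can be estimated from text alone. As an illustration only (not CLM's actual implementation), repetition can be approximated by word-level Jaccard similarity between consecutive outputs, and uncertainty by the density of hedging phrases:

```python
def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two texts (0.0-1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa and not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)

# Illustrative hedge list; a real detector would use a larger lexicon.
HEDGES = ("maybe", "perhaps", "might", "i think", "not sure", "possibly")

def uncertainty(text: str) -> float:
    """Hedging-phrase density per word, scaled and capped at 1.0."""
    t = text.lower()
    hits = sum(t.count(h) for h in HEDGES)
    words = max(len(t.split()), 1)
    return min(hits / words * 10, 1.0)

print(jaccard("plan step one", "plan step one again"))  # 0.75 — high overlap signals repetition
```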
CLM combines these four signals into a single CLM score (0–100) and acts by zone:
| Zone | Score | Action |
|---|---|---|
| Green | 0–40 | Pass through |
| Amber | 40–70 | Compress task branches, patch context |
| Red | 70–100 | Full compression + goal re-anchor |
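The scoring and zoning above can be sketched in a few lines. This is an illustration, not CLM's actual formula: the weights mirror the Configuration example below, and the weighted-sum combination is an assumption.

```python
# Illustrative weights matching the Configuration example (assumed formula).
WEIGHTS = {"branching": 0.30, "repetition": 0.25,
           "uncertainty": 0.25, "goal_distance": 0.20}

def clm_score(signals: dict) -> float:
    """Weighted sum of normalized signals (each 0.0-1.0), scaled to 0-100."""
    return 100.0 * sum(WEIGHTS[k] * signals[k] for k in WEIGHTS)

def zone(score: float, green_max: float = 40.0, amber_max: float = 70.0) -> str:
    """Map a score to its traffic-light zone using the configured thresholds."""
    if score <= green_max:
        return "Green"
    if score <= amber_max:
        return "Amber"
    return "Red"

score = clm_score({"branching": 0.9, "repetition": 0.8,
                   "uncertainty": 0.6, "goal_distance": 0.5})
print(round(score, 1), zone(score))  # 72.0 Red
```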
Configuration
```python
from clm import CLM, CLMConfig

config = CLMConfig(
    weights=[0.30, 0.25, 0.25, 0.20],  # [branching, repetition, uncertainty, goal_distance]
    green_max=40.0,
    amber_max=70.0,
    no_embed=False,  # Auto-set to True if sentence-transformers is not installed
)
clm = CLM(config, verbose=True)
```
Adapters
LangChain
```python
from clm.adapters import CLMCallbackHandler

handler = CLMCallbackHandler(verbose=True)
agent.run("your task", callbacks=[handler])
```
Loop Decorator
```python
from clm.adapters import CLMLoop

loop = CLMLoop(verbose=True)

@loop
def agent_step(prompt: str) -> str:
    return openai_client.chat(prompt)
```
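The mechanics of a loop decorator like this may be clearer from a self-contained toy version. This is illustrative only: CLM's real CLMLoop also scores outputs and intervenes, which the sketch marks with a comment.

```python
import functools

class ToyLoop:
    """Toy stand-in for CLMLoop: records each step's output for inspection."""

    def __init__(self, verbose: bool = False):
        self.verbose = verbose
        self.history = []

    def __call__(self, step_fn):
        @functools.wraps(step_fn)
        def wrapped(prompt: str) -> str:
            output = step_fn(prompt)
            self.history.append(output)  # a real CLMLoop would score and intervene here
            if self.verbose:
                print(f"step {len(self.history)}: {len(output)} chars")
            return output
        return wrapped

loop = ToyLoop()

@loop
def agent_step(prompt: str) -> str:
    return prompt.upper()  # stand-in for an LLM call

agent_step("hello")
print(loop.history)  # ['HELLO']
```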
Observability
```python
clm.get_score()    # Current CLM score (0–100)
clm.get_zone()     # "Green" | "Amber" | "Red"
clm.get_history()  # Step-by-step intervention log
clm.summary()      # Session statistics
```
Links
- Documentation: GitHub README
- Source: GitHub Repository
- Issues: Bug Reports
License
MIT
File details
Details for the file clm_plugin-0.1.3.tar.gz.
File metadata
- Download URL: clm_plugin-0.1.3.tar.gz
- Upload date:
- Size: 47.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e8bb987a0838a93bc94940352c5386f91e975c96e81aa13053cd8397813d98a3 |
| MD5 | 3b6ea6f709f9273740bea8bca04a0f9f |
| BLAKE2b-256 | 4e82924f87fb2877c87f3514a7fa050af3335687e1d2a6a6d265fdf8eca6cd1b |
File details
Details for the file clm_plugin-0.1.3-py3-none-any.whl.
File metadata
- Download URL: clm_plugin-0.1.3-py3-none-any.whl
- Upload date:
- Size: 32.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 515d8457c869e1a72f5d2bdafd66cd016175f44452b5e053548a12421af356f3 |
| MD5 | acb46359d442686da1c790a00324746d |
| BLAKE2b-256 | 29e30d0655c8ef19dba0d649eed0d3ffae05cda859c93fdc33cd4c4922fa17bf |