# neuroloop-py

NeuroLoop™ – an EXG-aware AI agent (Python edition).

A Python port of neuroloop, using litellm for multi-provider LLM access and prompt_toolkit for the interactive terminal UI.
## What it does

neuroloop-py is an EXG-aware conversational AI agent that:

- Reads your brainwaves before every turn via `neuroskill status`
- Injects your live mental state into the system prompt so the AI responds with full awareness of how you actually feel — cognitively, emotionally, somatically
- Auto-labels notable moments (awe, grief, deep focus, moral clarity, etc.) as permanent EXG annotations
- Runs guided protocols (breathing, meditation, grounding, somatic scans, etc.) step by step with OS notifications and EXG timestamps
- Searches the web, reads URLs, and maintains persistent memory across sessions
- Loads skill reference docs on demand based on what you're asking about
- Pre-warms the compare cache so session comparisons are instant when you ask
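
The per-turn EXG injection could be sketched roughly like this (a minimal illustration; `read_exg_status` and `build_system_prompt` are simplified stand-ins, not the project's actual helpers):

```python
import subprocess


def read_exg_status(timeout: float = 10.0) -> str:
    """Fetch the live EXG snapshot via `npx neuroskill status` (sketch)."""
    try:
        result = subprocess.run(
            ["npx", "neuroskill", "status"],
            capture_output=True, text=True, timeout=timeout,
        )
        return result.stdout.strip() or "(no EXG data)"
    except (OSError, subprocess.TimeoutExpired):
        return "(EXG status unavailable)"


def build_system_prompt(base_prompt: str, exg_status: str) -> str:
    """Append the live mental state so every turn sees how the user feels."""
    return f"{base_prompt}\n\n## Live EXG state\n{exg_status}"
```

Running the status subprocess before each turn (rather than once at startup) is what keeps the model's picture of your state current.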
## Architecture

```
neuroloop/
├── main.py              Entry point — model selection, CLI args, asyncio.run()
├── agent.py             NeuroloopAgent — main loop, before_agent_start hook, tool dispatch
├── memory.py            ~/.neuroskill/memory.md — read/write persistent memory
├── prompts.py           STATUS_PROMPT + build_system_prompt() + read_skill_index()
├── neuroskill/
│   ├── run.py           run_neuroskill() — subprocess executor (npx neuroskill ...)
│   ├── signals.py       detect_signals() — 40+ regex-based domain signal detectors
│   ├── context.py       select_contextual_data() — parallel queries + skill injection
│   └── client.py        SkillConnection — WebSocket live event listener
├── tools/
│   ├── web_fetch.py     web_fetch tool — URL → plain text
│   ├── web_search.py    web_search tool — DuckDuckGo Lite (no API key)
│   └── protocol.py      run_protocol tool — timed step execution + EXG labelling
├── NEUROLOOP.md         Capability index — always injected into the system prompt
├── METRICS.md           Full EXG metrics reference — injected on metric questions
└── skills/              One SKILL.md per neuroskill domain — injected on-demand
    ├── neuroskill-data-reference/
    ├── neuroskill-labels/
    ├── neuroskill-protocols/
    ├── neuroskill-recipes/
    ├── neuroskill-search/
    ├── neuroskill-sessions/
    ├── neuroskill-sleep/
    ├── neuroskill-status/
    ├── neuroskill-streaming/
    └── neuroskill-transport/
```
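
As a rough illustration of how `signals.py` might work, a regex-per-domain table like the one below would cover the detection side (the patterns here are hypothetical examples, not the project's actual detector set):

```python
import re

# Hypothetical subset of the 40+ domain signal detectors
SIGNAL_PATTERNS: dict[str, re.Pattern] = {
    "sleep": re.compile(r"\b(sleep|slept|insomnia|rem)\b", re.IGNORECASE),
    "sessions": re.compile(r"\b(session|compare|yesterday)\b", re.IGNORECASE),
    "protocols": re.compile(r"\b(breath\w*|meditat\w*|grounding)\b", re.IGNORECASE),
}


def detect_signals(prompt: str) -> set[str]:
    """Return the name of every domain whose pattern matches the prompt."""
    return {name for name, pattern in SIGNAL_PATTERNS.items()
            if pattern.search(prompt)}
```

A set (rather than a list) makes the downstream fan-out idempotent: each matched domain fires its queries and skill injection at most once per turn.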
## vs. the TypeScript original

| TypeScript (neuroloop) | Python (neuroloop-py) |
|---|---|
| pi coding agent framework | litellm + prompt_toolkit |
| pi ExtensionAPI | NeuroloopAgent class |
| before_agent_start hook | agent.before_agent_start() async method |
| pi registerTool | ALL_TOOLS list (OpenAI function schema) |
| pi InteractiveMode TUI | prompt_toolkit PromptSession + rich |
| pi skill loader (skillsOverride) | _load_skill() / _load_metrics_md() |
| pi model-based skill invocation | Signal-driven injection in context.py |
| WebSocket EXG live panel | Per-turn neuroskill status + WS events |
| ~/.neuroloop/ agent dir | ~/.neuroskill/ (shared with TypeScript) |
## Installation

```shell
cd /agent/ns/neuroloop-py
pip install -e .
# or: uv sync
```

Requires Python ≥ 3.12.
## Usage

```shell
# Interactive mode (auto-detects model)
neuroloop-py

# With a specific model
neuroloop-py --model claude-3-5-sonnet-20241022
neuroloop-py --model ollama/qwen3:5b

# With an initial message
neuroloop-py "How is my focus today?"

# Via python -m
python -m neuroloop --model gpt-4o "Summarise my last session"
```
## Model selection (priority order)

1. `--model MODEL` CLI flag
2. `NEUROLOOP_MODEL` environment variable
3. Auto-detect running Ollama → prefers `qwen3:5b`, falls back to the first available model
4. Cloud API keys: `ANTHROPIC_API_KEY` → claude, `OPENAI_API_KEY` → gpt-4o, `GEMINI_API_KEY` → gemini
5. Hard default: `ollama/qwen3:5b`
Note: Ollama models don't support function calling — tools are disabled automatically. Cloud models (Anthropic, OpenAI, Gemini) get full tool access.
## Keyboard shortcuts

| Key | Action |
|---|---|
| ctrl-d | Quit |
| ctrl-c | Cancel current LLM response (at prompt: press twice to quit) |
| tab | Autocomplete commands and /neuro subcommands |
## Commands

| Command | Description |
|---|---|
| /exg | Show live EXG snapshot |
| /exg on / /exg off | Toggle EXG display |
| /neuro <cmd> [args] | Run any neuroskill subcommand |
| /memory | Show persistent memory |
| /model [name] | Show or switch model |
| /help | List all commands |
| /quit | Exit |
| !cmd | Run a shell command |
## Tools available to the AI

Tools are only available to cloud models (Anthropic, OpenAI, Gemini). Ollama models receive the same context but respond in plain text without tool calls.

| Tool | Description |
|---|---|
| web_fetch | Fetch any URL → plain text |
| web_search | DuckDuckGo Lite search (no API key needed) |
| memory_read | Read ~/.neuroskill/memory.md |
| memory_write | Write / append to memory |
| neuroskill_label | Create a timestamped EXG annotation |
| neuroskill_run | Run any neuroskill subcommand |
| prewarm | Start background neuroskill compare cache build |
| run_protocol | Execute a timed multi-step guided protocol with EXG labels |
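
Each entry in ALL_TOOLS uses the OpenAI function-calling schema, which litellm forwards to whichever cloud provider is active. A plausible entry for web_search might look like this (the description text is illustrative):

```python
# Hypothetical ALL_TOOLS entry in the OpenAI function-calling format
WEB_SEARCH_TOOL = {
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web via DuckDuckGo Lite. No API key needed.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search terms."},
            },
            "required": ["query"],
        },
    },
}

ALL_TOOLS = [WEB_SEARCH_TOOL]  # plus web_fetch, memory_read, memory_write, ...
```

Because litellm normalizes this schema across providers, the same list works unchanged for Anthropic, OpenAI, and Gemini models.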
## Skills & signal-driven injection

Every user prompt is scanned by detect_signals() (40+ regex patterns across 40+ domains). Matching signals trigger two things in parallel:

- Neuroskill queries — `neuroskill session`, `neuroskill sleep`, `search-labels …`, etc. run concurrently and are appended to the system context
- Skill file injection — the relevant `skills/*/SKILL.md` or `METRICS.md` is read and prepended to the context block
| Signal | Skill injected | Queries fired |
|---|---|---|
| protocols | neuroskill-protocols | — |
| sleep | neuroskill-sleep | sleep, search-labels sleep … |
| sessions / compare | neuroskill-sessions | sessions |
| compare | neuroskill-search | compare (cached) |
| labels_api | neuroskill-labels | — |
| metrics_ref | neuroskill-data-reference + METRICS.md | — |
| transport | neuroskill-transport | — |
| streaming | neuroskill-streaming | — |
| scripting | neuroskill-recipes | — |
| session | neuroskill-status | session 0 |
| focus / stress / hrv / … | — | session 0, search-labels … |
NEUROLOOP.md (capability overview) is always injected between the static guidance
and the live EXG context.
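
The concurrent fan-out in `context.py` could be sketched with `asyncio.gather` (the signal-to-argv mapping below is a hypothetical subset, and `run_neuroskill` is a simplified stand-in):

```python
import asyncio


async def run_neuroskill(*args: str) -> str:
    """Run one `npx neuroskill …` subcommand (requires the npm package)."""
    proc = await asyncio.create_subprocess_exec(
        "npx", "neuroskill", *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.DEVNULL,
    )
    out, _ = await proc.communicate()
    return out.decode()


# Hypothetical subset of the signal → query mapping
SIGNAL_QUERIES: dict[str, list[str]] = {
    "sleep": ["sleep"],
    "sessions": ["sessions"],
    "session": ["session", "0"],
}


async def select_contextual_data(signals: set[str]) -> list[str]:
    """Fire all queries for the detected signals concurrently."""
    tasks = [run_neuroskill(*SIGNAL_QUERIES[s])
             for s in signals if s in SIGNAL_QUERIES]
    return list(await asyncio.gather(*tasks))
```

Running the subprocesses concurrently means the slowest query, not the sum of all of them, bounds the added latency per turn.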
## Requirements

- Python ≥ 3.12
- neuroskill npm package reachable via `npx neuroskill`
- At least one of:
  - A running Ollama instance (no API key needed, no tools)
  - `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, or `GEMINI_API_KEY` (full tool support)
## Project details
### neuroloop-0.0.1.tar.gz (source distribution)

- Size: 85.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | 3a42b3b46049b74c1c7ac17e8074aaf20cb642c2eaf47ce4ccbd96d680c53a20 |
| MD5 | c528f25ca379c08b7f0212103cd9632e |
| BLAKE2b-256 | 6764de4978ee4d29e3b2caa47fc840f9bd555c59fcc10d5308f9e55d6184f640 |
### neuroloop-0.0.1-py3-none-any.whl (built distribution)

- Size: 59.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | 873e3e6710136a7b228db12fcaca34dd2974080286dfbd699d395c4f287580ea |
| MD5 | 465195e02ea14f0df4b63764c0f38bb2 |
| BLAKE2b-256 | ee17f0979ed48395abf65980a23dcf254f2d6869d6d24e8e0b46753fca5e560c |