# Maxim
A bio-inspired cognitive architecture for AI agents. Combines a 5-agent pipeline with biological memory systems (Hippocampus, NAc, ATL, SCN, Angular Gyrus) and a reactive Default Network. Works headless, in simulation, or connected to a robot.
## Quickstart

```bash
# With Claude (fastest way to start)
pip install pymaxim[llm-anthropic]
export ANTHROPIC_API_KEY=sk-...
maxim --sim "test memory recall under interference"

# Or with a local model (no API key needed)
pip install pymaxim[llm-llama]
maxim --list-models                                 # see available models
maxim --sim "test memory recall" --llm mistral-7b   # auto-downloads on first run
```

Check your setup with `maxim doctor`, and find session results in `~/.maxim/sessions/`.
## What You Can Do

- Simulate cognitive scenarios -- test memory, safety, and causal learning with LLM-driven narrative arcs
- Run DM campaigns -- multi-encounter branching stories with SEM-embodied entities
- Benchmark models -- compare local and cloud LLMs across cognitive task suites
- Connect robots -- hardware-agnostic runtime; Reachy Mini ships in-tree, and third-party robots plug in via the `maxim.robots` entry-point group (Atlas, Spot, custom drones -- see robot-setup.md). Or run headless.
- Use the Python API -- 17 verb-based functions for programmatic access
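Third-party robot support works through standard Python entry points. As a sketch, a plugin package might register its driver in its own `pyproject.toml` like this; the package, module, and class names here are hypothetical, and only the `maxim.robots` group name comes from the docs above:

```toml
# pyproject.toml of a hypothetical third-party robot plugin
[project]
name = "maxim-spot-driver"   # hypothetical package name
version = "0.1.0"

[project.entry-points."maxim.robots"]
spot = "maxim_spot_driver.driver:SpotRobot"   # hypothetical module:class
```

Once installed, the plugin becomes discoverable by scanning the `maxim.robots` group; see robot-setup.md for details.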
## Installation

```bash
pip install pymaxim
```
### Optional Extras

| Extra | What it adds |
|---|---|
| `llm-llama` | Local LLM inference via llama.cpp |
| `llm-torch` | PyTorch/Transformers backend |
| `llm-anthropic` | Claude backend |
| `llm-openai` | OpenAI backend |
| `vision` | Camera + object detection |
| `audio` | Microphone + Whisper transcription |
| `reachy` | Reachy Mini robot SDK |
| `comms` | Twilio SMS/Voice |
| `semantic` | Sentence-transformer embeddings |
| `tts` | Text-to-speech via Piper |
| `database` | PostgreSQL + pgvector memory stores |

See getting-started.md for the full list of 16 extras.

```bash
# Local LLM + vision
pip install pymaxim[llm-llama,vision]

# Everything for development
pip install -e '.[llm-llama,llm-anthropic,llm-openai,vision,audio]'
```
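Because each extra is optional, application code that depends on one should degrade gracefully when it is missing. A minimal sketch using only the standard library; the module name each extra installs is an assumption on my part, not taken from the docs above:

```python
import importlib.util


def has_module(module: str) -> bool:
    """Return True if `module` is importable, without actually importing it."""
    return importlib.util.find_spec(module) is not None


# Example: prefer a local backend when its dependency is present.
# "llama_cpp" is the assumed module name behind the llm-llama extra.
backend = "local" if has_module("llama_cpp") else "cloud"
```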
## Python API

```python
import maxim

# Run a simulation
result = maxim.imagine(goal="test safety boundaries", persona="adversarial")

# Inspect bio-subsystems
state = maxim.observe("memory")

# Diagnose environment
report = maxim.diagnose()

# Start the agentic loop
maxim.run(model="mistral-7b")

# Manage models
models = maxim.list_models()
maxim.download_model("qwen2.5-14b-instruct")
```

See docs/user/python-api.md for the full API reference.
## CLI Quick Reference

```bash
# Agent runtime with local LLM
maxim --llm mistral-7b

# Agent runtime with Claude
maxim --llm claude-sonnet

# Generative campaign
maxim --sim "test memory recall" --persona adversarial

# YAML scenario (direct injection)
maxim --sim scenarios/experiments/hippocampal_recall_short.yaml

# DM campaign
maxim --sim scenarios/campaigns/heist_v1.yaml

# Multi-model benchmark
maxim --sim benchmark --models mistral-7b,qwen2.5-14b

# Environment diagnostics
maxim doctor

# Model management
maxim --list-models
maxim --delete-model llama-2-13b-chat
```

See docs/user/cli-reference.md for all flags.
## Operating Modes

Two independent dimensions control behavior:

- ProcessingState: `awake` or `sleep` (sleep is a tool the agent calls; it wakes on user input)
- OperationalMode: `planning` (propose + approve), `supervised` (act within bounds), `autonomous` (full self-direction)

See docs/user/modes-guide.md for details.
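Because the two dimensions are independent, any ProcessingState can pair with any OperationalMode. A minimal sketch of that state model, purely illustrative; this is not how Maxim implements it internally:

```python
from dataclasses import dataclass
from enum import Enum


class ProcessingState(Enum):
    AWAKE = "awake"
    SLEEP = "sleep"   # a tool the agent calls; user input wakes it


class OperationalMode(Enum):
    PLANNING = "planning"       # propose + approve
    SUPERVISED = "supervised"   # act within bounds
    AUTONOMOUS = "autonomous"   # full self-direction


@dataclass
class AgentState:
    state: ProcessingState = ProcessingState.AWAKE
    mode: OperationalMode = OperationalMode.PLANNING

    def on_user_input(self) -> None:
        # Waking changes only the processing state; the mode is untouched.
        self.state = ProcessingState.AWAKE
```

For example, a sleeping autonomous agent stays autonomous after waking; the two dimensions never constrain each other.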
## Documentation
| Guide | Description |
|---|---|
| Getting Started | First-run walkthrough |
| CLI Reference | All command-line flags |
| Python API | Programmatic usage |
| Simulation | Campaigns, scenarios, benchmarks |
| Modes | Operating modes and autonomy |
| LLM Setup | Model download and configuration |
| Peer Setup | Multi-machine / tunnel setup |
| Architecture | Module map, bio-system glossary, planning system |
## Contributing

Issues and PRs welcome at github.com/dennys246/Maxim.
## License

See LICENSE for details.
## File details

### pymaxim-0.6.0.tar.gz (source distribution)

- Size: 1.5 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0, CPython/3.12.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e8929cd7b047841061b0a43497311a773495fbe8fa93c6a7eeb5877657b2fcd6` |
| MD5 | `79f6eea3447208d86b031cca3915e9e7` |
| BLAKE2b-256 | `ef40df22498a450d1eff9ae9f535e3f9cfa739fe8b9daf49a208517824ce0a41` |
### pymaxim-0.6.0-py3-none-any.whl (built distribution)

- Size: 1.8 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0, CPython/3.12.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `9e6c294c9be00ff6516d72de3af06ed8e60018c62c3ea7d184c83530fe01b293` |
| MD5 | `c2843c442d184534c89ab39a2a3e7646` |
| BLAKE2b-256 | `be283ba298e37e5b9b6848f18b83bd26bcacca1b1a37e6d50692ca88b734d9a7` |
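To confirm a downloaded artifact matches the digests listed above, you can hash it locally with the standard library:

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hex SHA-256 digest of a file, streamed in chunks to bound memory use."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Compare against the SHA256 listed for the wheel above:
# sha256_of(Path("pymaxim-0.6.0-py3-none-any.whl")) ==
#     "9e6c294c9be00ff6516d72de3af06ed8e60018c62c3ea7d184c83530fe01b293"
```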