
Maxim

A bio-inspired cognitive architecture for AI agents. Combines a 5-agent pipeline with biological memory systems (Hippocampus, NAc, ATL, SCN, Angular Gyrus) and a reactive Default Network. Works headless, in simulation, or connected to a robot.

Quickstart

# With Claude (fastest way to start)
pip install pymaxim[llm-anthropic]
export ANTHROPIC_API_KEY=sk-...
maxim --sim "test memory recall under interference"

# Or with a local model (no API key needed)
pip install pymaxim[llm-llama]
maxim --list-models                        # see available models
maxim --sim "test memory recall" --llm mistral-7b   # auto-downloads on first run

Check your setup with maxim doctor, and find session results in ~/.maxim/sessions/.

What You Can Do

  • Simulate cognitive scenarios -- test memory, safety, and causal learning with LLM-driven narrative arcs
  • Run DM campaigns -- multi-encounter branching stories with SEM-embodied entities
  • Benchmark models -- compare local and cloud LLMs across cognitive task suites
  • Connect robots -- hardware-agnostic runtime; Reachy Mini support ships in-tree, and third-party robots plug in via the maxim.robots entry-point group (Atlas, Spot, custom drones -- see robot-setup.md). Or run headless.
  • Use the Python API -- 17 verb-based functions for programmatic access

Installation

pip install pymaxim

Optional Extras

Extra          What it adds
llm-llama      Local LLM inference via llama.cpp
llm-torch      PyTorch/Transformers backend
llm-anthropic  Claude backend
llm-openai     OpenAI backend
vision         Camera + object detection
audio          Microphone + Whisper transcription
reachy         Reachy Mini robot SDK
comms          Twilio SMS/Voice
semantic       Sentence-transformer embeddings
tts            Text-to-speech via Piper
database       PostgreSQL + pgvector memory stores

See getting-started.md for the full list of 16 extras.

# Local LLM + vision
pip install pymaxim[llm-llama,vision]

# Everything for development
pip install -e '.[llm-llama,llm-anthropic,llm-openai,vision,audio]'

Python API

import maxim

# Run a simulation
result = maxim.imagine(goal="test safety boundaries", persona="adversarial")

# Inspect bio-subsystems
state = maxim.observe("memory")

# Diagnose environment
report = maxim.diagnose()

# Start the agentic loop
maxim.run(model="mistral-7b")

# Manage models
models = maxim.list_models()
maxim.download_model("qwen2.5-14b-instruct")

See docs/user/python-api.md for the full API reference.

CLI Quick Reference

# Agent runtime with local LLM
maxim --llm mistral-7b

# Agent runtime with Claude
maxim --llm claude-sonnet

# Generative campaign
maxim --sim "test memory recall" --persona adversarial

# YAML scenario (direct injection)
maxim --sim scenarios/experiments/hippocampal_recall_short.yaml

# DM campaign
maxim --sim scenarios/campaigns/heist_v1.yaml

# Multi-model benchmark
maxim --sim benchmark --models mistral-7b,qwen2.5-14b

# Environment diagnostics
maxim doctor

# Model management
maxim --list-models
maxim --delete-model llama-2-13b-chat

See docs/user/cli-reference.md for all flags.

Operating Modes

Two independent dimensions control behavior:

  • ProcessingState: awake or sleep (sleep is a tool the agent invokes itself; it wakes on user input)
  • OperationalMode: planning (propose + approve), supervised (act within bounds), autonomous (full self-direction)
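Because the two dimensions are independent, the agent's control state is the product of the two. A minimal sketch of that state space, modeled with plain enums (illustrative only, not Maxim's actual API):

```python
# Model the two independent control dimensions as enums and enumerate
# their product to show the full combined state space.
from enum import Enum
from itertools import product

class ProcessingState(Enum):
    AWAKE = "awake"
    SLEEP = "sleep"

class OperationalMode(Enum):
    PLANNING = "planning"      # propose + approve
    SUPERVISED = "supervised"  # act within bounds
    AUTONOMOUS = "autonomous"  # full self-direction

# 2 processing states x 3 operational modes = 6 combined states
states = list(product(ProcessingState, OperationalMode))
print(len(states))  # 6
```

Changing one dimension never forces a change in the other, e.g. an agent can sleep while remaining in supervised mode.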

See docs/user/modes-guide.md for details.

Documentation

Guide            Description
Getting Started  First-run walkthrough
CLI Reference    All command-line flags
Python API       Programmatic usage
Simulation       Campaigns, scenarios, benchmarks
Modes            Operating modes and autonomy
LLM Setup        Model download and configuration
Peer Setup       Multi-machine / tunnel setup
Architecture     Module map, bio-system glossary, planning system

Contributing

Issues and PRs welcome at github.com/dennys246/Maxim.

License

See LICENSE for details.

