
Maxim

A bio-inspired cognitive architecture for AI agents. Combines a 5-agent pipeline with biological memory systems (Hippocampus, NAc, ATL, SCN, Angular Gyrus) and a reactive Default Network. Works headless, in simulation, or connected to a robot.

Quickstart

# With Claude (fastest way to start)
pip install pymaxim[llm-anthropic]
export ANTHROPIC_API_KEY=sk-...
maxim --sim "test memory recall under interference"

# Or with a local model (no API key needed)
pip install pymaxim[llm-llama]
maxim --list-models                        # see available models
maxim --sim "test memory recall" --llm mistral-7b   # auto-downloads on first run

Check your setup with maxim doctor, and find session results in ~/.maxim/sessions/.
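The quickstart states that session results land in ~/.maxim/sessions/, but does not document the file layout inside it. As a minimal sketch under that assumption, the following helper just lists whatever artifacts are there, newest first, without parsing them:

```python
from pathlib import Path

def list_sessions(root: Path = Path.home() / ".maxim" / "sessions") -> list[Path]:
    """Return session artifacts under the sessions directory, newest first.

    Only the ~/.maxim/sessions/ path comes from the quickstart; the file
    format inside it is undocumented here, so entries are listed, not parsed.
    """
    if not root.exists():
        return []
    return sorted(root.iterdir(), key=lambda p: p.stat().st_mtime, reverse=True)
```

Sorting by modification time rather than filename keeps the most recent run first regardless of how session files are named.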

What You Can Do

  • Simulate cognitive scenarios -- test memory, safety, causal learning with LLM-driven narrative arcs
  • Run DM campaigns -- multi-encounter branching stories with SEM-embodied entities
  • Benchmark models -- compare local and cloud LLMs across cognitive task suites
  • Connect robots -- hardware-agnostic runtime; Reachy Mini ships in-tree, third-party robots plug in via the maxim.robots entry-point group (Atlas, Spot, custom drones -- see robot-setup.md). Or run headless.
  • Use the Python API -- 17 verb-based functions for programmatic access

Installation

pip install pymaxim

Optional Extras

Extra          What it adds
llm-llama      Local LLM inference via llama.cpp
llm-torch      PyTorch/Transformers backend
llm-anthropic  Claude backend
llm-openai     OpenAI backend
vision         Camera + object detection
audio          Microphone + Whisper transcription
reachy         Reachy Mini robot SDK
comms          Twilio SMS/Voice
semantic       Sentence-transformer embeddings
tts            Text-to-speech via Piper
database       PostgreSQL + pgvector memory stores

See getting-started.md for the full list of 16 extras.

# Local LLM + vision
pip install pymaxim[llm-llama,vision]

# Everything for development
pip install -e '.[llm-llama,llm-anthropic,llm-openai,vision,audio]'

Python API

import maxim

# Run a simulation
result = maxim.imagine(goal="test safety boundaries", persona="adversarial")

# Inspect bio-subsystems
state = maxim.observe("memory")

# Diagnose environment
report = maxim.diagnose()

# Start the agentic loop
maxim.run(model="mistral-7b")

# Manage models
models = maxim.list_models()
maxim.download_model("qwen2.5-14b-instruct")

See docs/user/python-api.md for the full API reference.

CLI Quick Reference

# Agent runtime with local LLM
maxim --llm mistral-7b

# Agent runtime with Claude
maxim --llm claude-sonnet

# Generative campaign
maxim --sim "test memory recall" --persona adversarial

# YAML scenario (direct injection)
maxim --sim scenarios/experiments/hippocampal_recall_short.yaml

# DM campaign
maxim --sim scenarios/campaigns/heist_v1.yaml

# Multi-model benchmark
maxim --sim benchmark --models mistral-7b,qwen2.5-14b

# Environment diagnostics
maxim doctor

# Model management
maxim --list-models
maxim --delete-model llama-2-13b-chat

See docs/user/cli-reference.md for all flags.

Operating Modes

Two independent dimensions control behavior:

  • ProcessingState: awake or sleep (sleep is a tool the agent calls; the agent wakes on user input)
  • OperationalMode: planning (propose + approve), supervised (act within bounds), autonomous (full self-direction)

See docs/user/modes-guide.md for details.
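Because the two dimensions vary independently, agent state is a simple cross product. A minimal sketch of the idea (the enum and class names mirror the README's terms; this is an illustration, not Maxim's actual internal types):

```python
from dataclasses import dataclass
from enum import Enum

class ProcessingState(Enum):
    AWAKE = "awake"
    SLEEP = "sleep"

class OperationalMode(Enum):
    PLANNING = "planning"      # propose + approve
    SUPERVISED = "supervised"  # act within bounds
    AUTONOMOUS = "autonomous"  # full self-direction

@dataclass
class AgentState:
    """Two independent dimensions, giving 2 x 3 = 6 combinations."""
    processing: ProcessingState = ProcessingState.AWAKE
    mode: OperationalMode = OperationalMode.PLANNING

    def wake(self) -> None:
        # Per the modes description, user input wakes a sleeping agent.
        self.processing = ProcessingState.AWAKE
```

Keeping the dimensions as separate enums (rather than one six-value mode) matches the text: autonomy level can change without touching sleep/wake, and vice versa.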

Documentation

Guide            Description
Getting Started  First-run walkthrough
CLI Reference    All command-line flags
Python API       Programmatic usage
Simulation       Campaigns, scenarios, benchmarks
Modes            Operating modes and autonomy
LLM Setup        Model download and configuration
Peer Setup       Multi-machine / tunnel setup
Architecture     Module map, bio-system glossary, planning system

Contributing

Issues and PRs welcome at github.com/dennys246/Maxim.

License

See LICENSE for details.
