
Maxim

A bio-inspired cognitive architecture for AI agents. Combines a 5-agent pipeline with biological memory systems (Hippocampus, NAc, ATL, SCN, Angular Gyrus) and a reactive Default Network. Works headless, in simulation, or connected to a robot.

Quickstart

# With Claude (fastest way to start)
pip install 'pymaxim[llm-anthropic]'
export ANTHROPIC_API_KEY=sk-...
maxim --sim "test memory recall under interference"

# Or with a local model (no API key needed)
pip install 'pymaxim[llm-llama]'
maxim --list-models                        # see available models
maxim --sim "test memory recall" --llm mistral-7b   # auto-downloads on first run

Check your setup with maxim doctor, and find session results in ~/.maxim/sessions/.
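Session results are plain files under ~/.maxim/sessions/, so they can be enumerated with the standard library. A minimal sketch, assuming one entry per session (the exact on-disk layout may differ):

```python
from pathlib import Path


def list_sessions(root: Path = Path.home() / ".maxim" / "sessions") -> list[Path]:
    """Return session entries under the Maxim sessions directory,
    newest first. Returns an empty list if none exist yet."""
    if not root.exists():
        return []
    return sorted(root.iterdir(), key=lambda p: p.stat().st_mtime, reverse=True)


print(list_sessions())
```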

What You Can Do

  • Simulate cognitive scenarios -- test memory, safety, causal learning with LLM-driven narrative arcs
  • Run DM campaigns -- multi-encounter branching stories with SEM-embodied entities
  • Benchmark models -- compare local and cloud LLMs across cognitive task suites
  • Connect robots -- hardware-agnostic runtime; Reachy Mini ships in-tree, and third-party robots plug in via the maxim.robots entry-point group (Atlas, Spot, custom drones -- see robot-setup.md). Or run headless.
  • Use the Python API -- 17 verb-based functions for programmatic access

Installation

pip install pymaxim

Optional Extras

Extra          What it adds
llm-llama      Local LLM inference via llama.cpp
llm-torch      PyTorch/Transformers backend
llm-anthropic  Claude backend
llm-openai     OpenAI backend
vision         Camera + object detection
audio          Microphone + Whisper transcription
reachy         Reachy Mini robot SDK
comms          Twilio SMS/Voice
semantic       Sentence-transformer embeddings
tts            Text-to-speech via Piper
database       PostgreSQL + pgvector memory stores

See getting-started.md for the full list of 16 extras.

# Local LLM + vision
pip install 'pymaxim[llm-llama,vision]'

# Everything for development
pip install -e '.[llm-llama,llm-anthropic,llm-openai,vision,audio]'

Python API

import maxim

# Run a simulation
result = maxim.imagine(goal="test safety boundaries", persona="adversarial")

# Inspect bio-subsystems
state = maxim.observe("memory")

# Diagnose environment
report = maxim.diagnose()

# Start the agentic loop
maxim.run(model="mistral-7b")

# Manage models
models = maxim.list_models()
maxim.download_model("qwen2.5-14b-instruct")

See docs/user/python-api.md for the full API reference.

CLI Quick Reference

# Agent runtime with local LLM
maxim --llm mistral-7b

# Agent runtime with Claude
maxim --llm claude-sonnet

# Generative campaign
maxim --sim "test memory recall" --persona adversarial

# YAML scenario (direct injection)
maxim --sim scenarios/experiments/hippocampal_recall_short.yaml

# DM campaign
maxim --sim scenarios/campaigns/heist_v1.yaml

# Multi-model benchmark
maxim --sim benchmark --models mistral-7b,qwen2.5-14b

# Environment diagnostics
maxim doctor

# Model management
maxim --list-models
maxim --delete-model llama-2-13b-chat

See docs/user/cli-reference.md for all flags.

Operating Modes

Two independent dimensions control behavior:

  • ProcessingState: awake or sleep (sleep is a tool the agent invokes itself; any user input wakes it)
  • OperationalMode: planning (propose and await approval), supervised (act within configured bounds), autonomous (full self-direction)
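The two dimensions can be pictured as independent enums. A minimal sketch of the concept (illustrative only -- not Maxim's internal representation; the class and field names here are invented for the example):

```python
from dataclasses import dataclass
from enum import Enum


class ProcessingState(Enum):
    AWAKE = "awake"
    SLEEP = "sleep"


class OperationalMode(Enum):
    PLANNING = "planning"      # propose actions, wait for approval
    SUPERVISED = "supervised"  # act within configured bounds
    AUTONOMOUS = "autonomous"  # full self-direction


@dataclass
class AgentState:
    state: ProcessingState = ProcessingState.AWAKE
    mode: OperationalMode = OperationalMode.PLANNING

    def on_user_input(self) -> None:
        # Any user input wakes a sleeping agent; the mode is unaffected.
        self.state = ProcessingState.AWAKE


agent = AgentState(state=ProcessingState.SLEEP, mode=OperationalMode.SUPERVISED)
agent.on_user_input()
print(agent.state)  # ProcessingState.AWAKE
```

Note that waking changes only the ProcessingState axis; the OperationalMode (and thus the autonomy level) stays as configured.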

See docs/user/modes-guide.md for details.

Documentation

Guide            Description
Getting Started  First-run walkthrough
CLI Reference    All command-line flags
Python API       Programmatic usage
Simulation       Campaigns, scenarios, benchmarks
Modes            Operating modes and autonomy
LLM Setup        Model download and configuration
Peer Setup       Multi-machine / tunnel setup
Architecture     Module map, bio-system glossary, planning system

Contributing

Issues and PRs welcome at github.com/dennys246/Maxim.

License

See LICENSE for details.
