open-strix

Minimal, non-production autonomous agent harness built with LangGraph Deep Agents.

Install uv

If you don't already have uv, install it:

# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

The official uv install docs also cover alternate methods such as Homebrew, pipx, and winget.

Quick start

uv init --python 3.11
uv add open-strix
uv run open-strix

On first run, open-strix bootstraps the current directory with:

  • state/
  • skills/
  • blocks/
  • logs/events.jsonl
  • logs/journal.jsonl
  • scheduler.yaml
  • config.yaml
  • checkpoint.md

If a Discord token is present (read from DISCORD_TOKEN by default), open-strix connects to Discord; otherwise it runs in local stdin mode.
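The first-run scaffold above can be pictured as idempotent "create if missing" logic. A minimal sketch (the bootstrap function and its return value are illustrative, not actual open-strix internals):

```python
from pathlib import Path

# Hypothetical sketch of the first-run bootstrap: create any missing
# scaffold entries (names mirror the list above; the logic is assumed).
SCAFFOLD_DIRS = ["state", "skills", "blocks", "logs"]
SCAFFOLD_FILES = [
    "logs/events.jsonl",
    "logs/journal.jsonl",
    "scheduler.yaml",
    "config.yaml",
    "checkpoint.md",
]

def bootstrap(root: str = ".") -> list[str]:
    """Create missing scaffold entries under root; return what was created."""
    created = []
    base = Path(root)
    for d in SCAFFOLD_DIRS:
        path = base / d
        if not path.exists():
            path.mkdir(parents=True)
            created.append(d)
    for f in SCAFFOLD_FILES:
        path = base / f
        if not path.exists():
            path.parent.mkdir(parents=True, exist_ok=True)
            path.touch()
            created.append(f)
    return created
```

Because existing entries are skipped, running it again in an already-bootstrapped directory is a no-op.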

Environment setup

Start from the example env file:

cp .env.example .env

Default model setup in this project expects an Anthropic-compatible endpoint:

  • ANTHROPIC_API_KEY
  • ANTHROPIC_BASE_URL

Discord runtime uses:

  • DISCORD_TOKEN

Optional:

  • DISCORD_TEST_CHANNEL_ID
  • OPEN_STRIX_TEST_MODEL

Models

Default: MiniMax M2.5

This project defaults to:

  • model: MiniMax-M2.5 in config.yaml
  • provider prefix anthropic: internally (so the runtime uses anthropic:MiniMax-M2.5)

Use MiniMax's Anthropic-compatible endpoint in your .env:

  • ANTHROPIC_BASE_URL=https://api.minimax.io/anthropic

See the MiniMax docs for details on the Anthropic-compatible endpoint.

Alternative: Kimi K2.5

If you want Kimi instead of MiniMax:

  1. Point Anthropic-compatible env vars at your Moonshot endpoint (see Moonshot docs for current endpoint details).
  2. Set model in config.yaml to the current Kimi model ID you want.

See the Moonshot docs for current endpoints and model IDs.

Note: the Moonshot update posted on November 8, 2025 references kimi-k2-thinking and kimi-k2-thinking-turbo. If you refer to these as "K2.5", use the exact current model IDs from Moonshot docs/console.

Model config behavior

config.yaml key:

  • model

Behavior:

  • If model has no : (example MiniMax-M2.5), open-strix treats it as Anthropic-provider and uses anthropic:<model>.
  • If model already includes provider:model (example openai:gpt-4o-mini), it is passed through unchanged.
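That prefixing rule can be expressed in a few lines (function name hypothetical; behavior as documented above):

```python
def resolve_model(model: str) -> str:
    """Bare model names get the anthropic: prefix by default;
    provider-qualified names pass through unchanged."""
    return model if ":" in model else f"anthropic:{model}"
```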

Discord setup

Use Discord's Developer Portal web UI:

  1. Go to https://discord.com/developers/applications and create a new application.
  2. Open your app, then go to the Bot tab.
  3. Under Token, generate/reset the token and copy it (you won't be able to view it again later).
  4. In the same Bot tab, enable Message Content Intent (required for open-strix message handling).
  5. Go to Installation.
  6. Under Installation Contexts, enable Guild Install.
  7. Under Install Link, pick Discord Provided Link.
  8. Under Default Install Settings:
    • Guild Install scopes: select bot (and applications.commands if you plan to add slash commands).
    • Permissions: for this bot, a practical baseline is:
      • View Channels
      • Send Messages
      • Send Messages in Threads
      • Read Message History
      • Add Reactions
  9. Copy the generated install link, open it in your browser, pick your server, and authorize.

Discord's official developer documentation covers the same flow.

Where this is configured in open-strix:

  • Token env var name: config.yaml -> discord_token_env (default DISCORD_TOKEN)
  • Actual token value: your .env
  • Bot allowlist behavior: config.yaml -> always_respond_bot_ids

config.yaml tour

Default:

model: MiniMax-M2.5
journal_entries_in_prompt: 90
discord_messages_in_prompt: 10
discord_token_env: DISCORD_TOKEN
always_respond_bot_ids: []

Key meanings:

  • model: model name (or provider:model)
  • journal_entries_in_prompt: how many journal entries go into each prompt
  • discord_messages_in_prompt: how many recent Discord messages go into each prompt
  • discord_token_env: env var name to read Discord token from
  • always_respond_bot_ids: bot author IDs the agent is allowed to respond to
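The defaults-plus-overrides behavior implied by this tour could be sketched as follows (loader name and merge strategy are assumptions, not open-strix's actual loader):

```python
# Defaults mirror the config.yaml tour above.
DEFAULTS = {
    "model": "MiniMax-M2.5",
    "journal_entries_in_prompt": 90,
    "discord_messages_in_prompt": 10,
    "discord_token_env": "DISCORD_TOKEN",
    "always_respond_bot_ids": [],
}

def load_config(overrides: dict) -> dict:
    """Overlay user-provided keys on the defaults; unknown keys pass through."""
    cfg = dict(DEFAULTS)
    cfg.update(overrides)
    return cfg
```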

Related files:

  • scheduler.yaml: cron/time-of-day jobs
  • blocks/*.yaml: memory blocks surfaced in prompt context
  • checkpoint.md: returned by journal tool after a journal write
  • skills/: user-editable local skills
  • /.open_strix_builtin_skills/skill-creator/SKILL.md: packaged built-in skill source mounted as read-only

Runtime behavior note:

  • Git sync (git add -A -> commit -> push) runs automatically after each processed turn.
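A sketch of that post-turn sync, with a hypothetical dry_run flag so the command sequence can be inspected without touching a repository:

```python
import subprocess

def git_sync_commands(message: str = "auto-sync") -> list[list[str]]:
    """The add -> commit -> push sequence described above."""
    return [
        ["git", "add", "-A"],
        ["git", "commit", "-m", message],
        ["git", "push"],
    ]

def git_sync(message: str = "auto-sync", dry_run: bool = False) -> list[list[str]]:
    cmds = git_sync_commands(message)
    if dry_run:
        return cmds
    for cmd in cmds:
        # check=False: the commit step may legitimately no-op when
        # nothing changed since the previous turn.
        subprocess.run(cmd, check=False)
    return cmds
```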

Personality bootstrap

Creating an agent is less about code and much more about the time you spend talking to it. Lily Luo has a great post on forming agent personalities.

You should plan on spending time:

  • Communication patterns — coach the agent on when and how often it should use the send_message and react tools. Agents are often initially surprised that their final message is ignored; they need to use their tools instead.
  • Talk about things you're interested in and see what the agent becomes interested in.

Tests

uv run pytest -q

Discord coverage includes:

  • unit tests with mocked boundaries in tests/test_discord.py
  • live integration tests against real Discord in tests/test_discord_live.py

Live test env vars:

  • DISCORD_TOKEN (required for live connect test)
  • DISCORD_TEST_CHANNEL_ID (optional; enables live send-message test)

Safety baseline

  • Agent file writes/edits are blocked outside state/.
  • Reads are still allowed anywhere within the repository.
  • This is intentionally simple and should not be treated as production-ready.
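A write-path guard like the one described might look like this (function name hypothetical; the real implementation may differ). Resolving the path first catches escapes via `..`:

```python
from pathlib import Path

def write_allowed(path: str, repo_root: str) -> bool:
    """Allow writes only to paths that resolve inside <repo>/state/."""
    state_dir = (Path(repo_root) / "state").resolve()
    target = (Path(repo_root) / path).resolve()
    return target == state_dir or state_dir in target.parents
```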
