
Optimal context selection for AI coding assistants using QUBO/Ising formulation

Project description

Anneal

Optimal context selection for AI coding assistants. Cooling random context down to exactly what your AI needs.

Anneal reads your codebase's structural graph (from Graphify or code-review-graph), formulates "which chunks are optimal?" as a QUBO problem, solves with simulated annealing, and returns the minimum context set for your task.

How It Works

  1. Reads codebase graph (Graphify graph.json or code-review-graph SQLite)
  2. Generates candidate chunks (keyword matching + graph topology)
  3. Formulates QUBO: minimize token cost, maximize relevance, reward dependency coverage
  4. Solves via simulated annealing (SpinChain engine)
  5. Returns stability-ranked, dependency-ordered chunks within your token budget
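The QUBO step above can be sketched in miniature. This is an illustrative formulation only, not Anneal's actual objective or the SpinChain API: the toy chunk data, the weights alpha/beta/gamma, and the single-bit-flip annealing loop are all assumptions. The idea is that each candidate chunk gets a binary variable, the diagonal of Q trades token cost against relevance, and off-diagonal terms reward selecting a chunk together with its dependencies:

```python
import math
import random

# Toy chunk data (hypothetical): token cost, keyword relevance, and one
# dependency edge (chunk 2 depends on chunk 0).
tokens = [120, 400, 80]
relevance = [0.9, 0.1, 0.7]
deps = [(2, 0)]

# QUBO diagonal: token-cost penalty minus relevance reward.
# Off-diagonal: reward for selecting a chunk together with its dependency.
alpha, beta, gamma = 0.01, 2.0, 1.0
n = len(tokens)
Q = [[0.0] * n for _ in range(n)]
for i in range(n):
    Q[i][i] = alpha * tokens[i] - beta * relevance[i]
for i, j in deps:
    Q[i][j] -= gamma

def energy(x):
    """QUBO energy x^T Q x for a binary selection vector x."""
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def solve_sa(num_sweeps=500, t_start=2.0, t_end=0.01, seed=0):
    """Minimise the QUBO with single-bit-flip simulated annealing."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = energy(x)
    best_x, best_e = x[:], e
    for sweep in range(num_sweeps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (sweep / (num_sweeps - 1))
        for i in range(n):
            # Energy change from flipping bit i (x_i^2 == x_i for binary x).
            delta = (1 - 2 * x[i]) * (
                Q[i][i] + sum((Q[i][j] + Q[j][i]) * x[j] for j in range(n) if j != i)
            )
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                x[i] ^= 1
                e += delta
                if e < best_e:
                    best_x, best_e = x[:], e
    return best_x, best_e

selection, cost = solve_sa()
print(selection)  # [1, 0, 1]: the two cheap, relevant, dependency-linked chunks
```

With these toy weights the expensive low-relevance chunk is dropped and the dependency pair is kept, which is the shape of trade-off the pipeline describes.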


Installation

pip install anneal-context

Or with uv:

uv tool install anneal-context

Setup

1. Install a graph tool (required):

# code-review-graph
npx code-review-graph install

# Graphify (Claude Code)
/plugin marketplace add safishamsi/graphify && /graphify

2. Create .anneal/config.toml in your project root:

[budget]
default_tokens = 5000
strategy = "balanced"   # "minimal" | "balanced" | "thorough"

[solver]
backend = "simulated-annealing"
num_reads = 100
num_sweeps = 1000

Add .anneal/ to your .gitignore.

MCP Server Setup

Claude Code

Add to .claude/settings.json:

{
  "mcpServers": {
    "anneal": {
      "command": "anneal-server"
    }
  }
}

Gemini CLI

Add to ~/.gemini/settings.json:

{
  "mcpServers": {
    "anneal": {
      "command": "anneal-server"
    }
  }
}

OpenAI Codex CLI

Add to ~/.codex/config.toml:

[mcp_servers.anneal]
command = "anneal-server"

Cursor / VS Code + Copilot / Aider

For any other MCP-compatible client, configure it to launch anneal-server over the stdio transport.

Tools

get_optimal_context

Parameters:
  task_description: str     -- what you want to do
  token_budget: int | None  -- max tokens (default: 5000)
  include_files: list[str]  -- always include these paths
  exclude_files: list[str]  -- never include these paths
  strategy: str             -- "balanced" | "minimal" | "thorough"

Returns:
  selected_chunks: list[{path, content, relevance_score, tokens}]
  total_tokens: int
  budget_utilization: float
  stability_score: float
  dependency_graph: dict
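For illustration, the return shape maps naturally onto Python types. The Chunk dataclass and the budget_utilization arithmetic below are assumptions that mirror the documented fields, not Anneal's internal types:

```python
from dataclasses import dataclass, asdict

@dataclass
class Chunk:
    # Mirrors the documented selected_chunks entry (hypothetical shape).
    path: str
    content: str
    relevance_score: float
    tokens: int

def build_response(chunks: list[Chunk], token_budget: int) -> dict:
    """Assemble a response with the documented fields (illustrative only)."""
    total = sum(c.tokens for c in chunks)
    return {
        "selected_chunks": [asdict(c) for c in chunks],
        "total_tokens": total,
        "budget_utilization": total / token_budget,
    }

resp = build_response(
    [
        Chunk("src/auth.py", "...", 0.92, 1800),
        Chunk("src/db.py", "...", 0.75, 1200),
    ],
    token_budget=5000,
)
print(resp["total_tokens"], resp["budget_utilization"])  # 3000 0.6
```

So budget_utilization is simply total_tokens divided by the requested token_budget, e.g. 3000 / 5000 = 0.6.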

get_status

Returns graph source availability, node counts, and the current solver config.

Development

git clone https://github.com/ameyakhot/anneal
cd anneal
uv venv && source .venv/bin/activate
uv pip install -e /path/to/spinchain   # local checkout of the SpinChain solver engine
uv pip install -e ".[dev]"
python -m pytest tests/ -v

License

MIT



Download files


Source Distribution

anneal_context-0.1.3.tar.gz (131.4 kB)


Built Distribution


anneal_context-0.1.3-py3-none-any.whl (17.3 kB)


File details

Details for the file anneal_context-0.1.3.tar.gz.

File metadata

  • Download URL: anneal_context-0.1.3.tar.gz
  • Upload date:
  • Size: 131.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for anneal_context-0.1.3.tar.gz:

  • SHA256: a88773c00c9149d2827cb2d603c4922cdf89ca8d2345ace073ece2cc90fec6fd
  • MD5: 2a99fdcd5d30bdd7888d1aa912946b40
  • BLAKE2b-256: 4ffdd5b0f151266ec811c98436b2b9a011629717a148f1c0b7449c26a56c2e8b


Provenance

The following attestation bundles were made for anneal_context-0.1.3.tar.gz:

Publisher: publish.yml on ameyakhot/anneal


File details

Details for the file anneal_context-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: anneal_context-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 17.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for anneal_context-0.1.3-py3-none-any.whl:

  • SHA256: 5e243bff1ab53f1ec7fc3a8ed5c25dce0bd485476a60e2777509af093b114adb
  • MD5: a037dc5836de04cf2048ff541323c804
  • BLAKE2b-256: a35c98d164eba4274eb1aea5923c6992471e30c8756956a9ba1cbc4bd9953e4e


Provenance

The following attestation bundles were made for anneal_context-0.1.3-py3-none-any.whl:

Publisher: publish.yml on ameyakhot/anneal

