Optimal context selection for AI coding assistants using QUBO/Ising formulation

Project description

Anneal

Optimal context selection for AI coding assistants. Cooling random context down to exactly what your AI needs.

Anneal reads your codebase's structural graph (from Graphify or code-review-graph), formulates "which chunks are optimal?" as a QUBO problem, solves with simulated annealing, and returns the minimum context set for your task.

How It Works

  1. Reads codebase graph (Graphify graph.json or code-review-graph SQLite)
  2. Generates candidate chunks (keyword matching + graph topology)
  3. Formulates QUBO: minimize token cost, maximize relevance, reward dependency coverage
  4. Solves via simulated annealing (SpinChain engine)
  5. Returns stability-ranked, dependency-ordered chunks within your token budget
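Steps 3–4 can be sketched with a toy QUBO and a self-contained simulated annealer. This is an illustration only, not the SpinChain engine's actual API: the `solve_qubo` helper, the coefficients, and the linear cooling schedule are all hypothetical.

```python
import math
import random

def solve_qubo(Q, num_reads=10, num_sweeps=500, seed=0):
    """Minimize x^T Q x over binary x via simulated annealing.

    Q maps (i, j) index pairs to coefficients: diagonal entries are
    per-chunk costs/relevance, off-diagonal entries are pairwise terms
    (e.g. a negative reward for covering a dependency).
    """
    rng = random.Random(seed)
    n = 1 + max(max(i, j) for i, j in Q)

    def energy(x):
        return sum(c * x[i] * x[j] for (i, j), c in Q.items())

    best_x, best_e = None, float("inf")
    for _ in range(num_reads):
        x = [rng.randint(0, 1) for _ in range(n)]
        e = energy(x)
        for sweep in range(num_sweeps):
            T = max(1e-3, 1.0 - sweep / num_sweeps)  # linear cooling
            k = rng.randrange(n)
            x[k] ^= 1                                # propose a bit flip
            e_new = energy(x)
            if e_new <= e or rng.random() < math.exp((e - e_new) / T):
                e = e_new                            # accept the flip
            else:
                x[k] ^= 1                            # reject: flip back
        if e < best_e:
            best_x, best_e = list(x), e
    return best_x, best_e

# Toy instance: two relevant chunks (negative linear terms), a coupling
# that rewards selecting both, and one irrelevant chunk with a cost.
Q = {(0, 0): -2.0, (1, 1): -1.5, (0, 1): -0.5, (2, 2): 1.0}
x, e = solve_qubo(Q)  # → [1, 1, 0], energy -4.0
```

In the real formulation the linear terms would encode token cost minus relevance, and the quadratic terms would encode dependency-coverage rewards, per step 3 above.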

Requirements

Installation

pip install anneal-context

Or with uv:

uv tool install anneal-context

Setup

1. Install a graph tool (required):

# code-review-graph
npx code-review-graph install

# Graphify (Claude Code)
/plugin marketplace add safishamsi/graphify && /graphify

2. Create .anneal/config.toml in your project root:

[budget]
default_tokens = 5000
strategy = "balanced"   # "minimal" | "balanced" | "thorough"

[solver]
backend = "simulated-annealing"
num_reads = 100
num_sweeps = 1000

Add .anneal/ to your .gitignore.

MCP Server Setup

Claude Code

Add to .claude/settings.json:

{
  "mcpServers": {
    "anneal": {
      "command": "anneal-server"
    }
  }
}

Gemini CLI

Add to ~/.gemini/settings.json:

{
  "mcpServers": {
    "anneal": {
      "command": "anneal-server"
    }
  }
}

OpenAI Codex CLI

Add to ~/.codex/config.toml:

[[mcp_servers]]
name = "anneal"
command = "anneal-server"

Cursor / VS Code + Copilot / Aider

Any MCP-compatible client works: register anneal-server as a server using the stdio transport.

Tools

get_optimal_context

Parameters:
  task_description: str     -- what you want to do
  token_budget: int | None  -- max tokens (default: 5000)
  include_files: list[str]  -- always include these paths
  exclude_files: list[str]  -- never include these paths
  strategy: str             -- "balanced" | "minimal" | "thorough"

Returns:
  selected_chunks: list[{path, content, relevance_score, tokens}]
  total_tokens: int
  budget_utilization: float
  stability_score: float
  dependency_graph: dict
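For reference, a tools/call request for this tool might look like the JSON-RPC 2.0 payload below. Your MCP client normally constructs and sends this for you over stdio; the argument values here are hypothetical.

```python
import json

# A JSON-RPC 2.0 "tools/call" request, with arguments matching the
# parameter schema listed above. Values are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_optimal_context",
        "arguments": {
            "task_description": "add retry logic to the HTTP client",
            "token_budget": 4000,
            "include_files": ["src/http/client.py"],
            "exclude_files": [],
            "strategy": "balanced",
        },
    },
}
payload = json.dumps(request)  # one line of JSON on the server's stdin
```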

get_status

Returns graph source availability, node counts, solver config.

Development

git clone https://github.com/ameyakhot/anneal
cd anneal
uv venv && source .venv/bin/activate
uv pip install -e /path/to/spinchain
uv pip install -e ".[dev]"
python -m pytest tests/ -v

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

anneal_context-0.1.0.tar.gz (130.7 kB)


Built Distribution


anneal_context-0.1.0-py3-none-any.whl (17.0 kB)


File details

Details for the file anneal_context-0.1.0.tar.gz.

File metadata

  • Download URL: anneal_context-0.1.0.tar.gz
  • Size: 130.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for anneal_context-0.1.0.tar.gz:

  • SHA256: c96256b02a6018d6e5f46c39f62da2375a5b2d13371697c86f5b1ddf59e1413c
  • MD5: 95aea617c3569c53d6aa449668d11d47
  • BLAKE2b-256: 06501d3016e48af64d8221b4dc2891ed16d22bc6912ef2d87f6aa909e5a3a4b2


Provenance

The following attestation bundles were made for anneal_context-0.1.0.tar.gz:

Publisher: publish.yml on ameyakhot/anneal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file anneal_context-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: anneal_context-0.1.0-py3-none-any.whl
  • Size: 17.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for anneal_context-0.1.0-py3-none-any.whl:

  • SHA256: 5c59b86e8f5e8f5aa1572ac6e40c529057ed0aba2bc84e09008edcbe47bbed62
  • MD5: 4230311a5479df4119d1d031994407cf
  • BLAKE2b-256: 4b0e9c76f674856ed76fbec2f59390dbc54607d7c41f846af7a2655c46be123f


Provenance

The following attestation bundles were made for anneal_context-0.1.0-py3-none-any.whl:

Publisher: publish.yml on ameyakhot/anneal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
