Anneal

Optimal context selection for AI coding assistants via a QUBO/Ising formulation: cooling random context down to exactly what your AI needs.

Anneal reads your codebase's structural graph (from Graphify or code-review-graph), formulates "which chunks are optimal?" as a QUBO problem, solves with simulated annealing, and returns the minimum context set for your task.

How It Works

  1. Reads codebase graph (Graphify graph.json or code-review-graph SQLite)
  2. Generates candidate chunks (keyword matching + graph topology)
  3. Formulates QUBO: minimize token cost, maximize relevance, reward dependency coverage
  4. Solves via simulated annealing (SpinChain engine)
  5. Returns stability-ranked, dependency-ordered chunks within your token budget
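Steps 2–4 can be sketched end to end. Everything below is illustrative: the chunk data, weights, and penalty terms are assumptions for demonstration, not Anneal's actual formulation (SpinChain is the real solver engine).

```python
import math
import random

# Toy candidate chunks -- paths, token counts, relevance scores, and
# dependency edges are all made up for illustration.
chunks = [
    {"path": "auth/login.py",   "tokens": 1200, "rel": 0.9},
    {"path": "auth/session.py", "tokens": 800,  "rel": 0.7},
    {"path": "db/models.py",    "tokens": 1500, "rel": 0.4},
    {"path": "utils/crypto.py", "tokens": 600,  "rel": 0.6},
]
deps = [(0, 1), (1, 3)]   # chunk 0 depends on chunk 1, chunk 1 on chunk 3
BUDGET = 3000

def energy(x):
    """QUBO-style objective over a binary selection vector x: lower is better."""
    tokens = sum(c["tokens"] for c, xi in zip(chunks, x) if xi)
    e = -sum(c["rel"] for c, xi in zip(chunks, x) if xi)     # maximize relevance
    e += 2.0 * max(0, tokens - BUDGET) / BUDGET              # soft token-budget penalty
    e += 0.3 * sum(1 for a, b in deps if x[a] and not x[b])  # penalize uncovered deps
    return e

def anneal(sweeps=1000, t0=1.0, t1=0.01, seed=0):
    """Single-read simulated annealing with geometric cooling."""
    rng = random.Random(seed)
    x = [rng.random() < 0.5 for _ in range(len(chunks))]
    cur = energy(x)
    best, best_e = list(x), cur
    for s in range(sweeps):
        t = t0 * (t1 / t0) ** (s / max(1, sweeps - 1))  # geometric cooling schedule
        i = rng.randrange(len(x))
        x[i] = not x[i]                                  # propose a single bit flip
        new = energy(x)
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new                                    # Metropolis accept
            if cur < best_e:
                best, best_e = list(x), cur
        else:
            x[i] = not x[i]                              # reject: undo the flip
    return best, best_e

# num_reads-style independent restarts; keep the lowest-energy read.
results = [anneal(seed=s) for s in range(20)]
selection, best_e = min(results, key=lambda r: r[1])
picked = [c["path"] for c, xi in zip(chunks, selection) if xi]
total = sum(c["tokens"] for c, xi in zip(chunks, selection) if xi)
```

With these toy numbers the solver settles on the three chunks that cover both dependency edges while staying under the 3000-token budget, leaving the low-relevance db/models.py out.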

Requirements

Python 3 and one installed graph tool (Graphify or code-review-graph; see Setup below).

Installation

pip install anneal-context

Or with uv:

uv tool install anneal-context

Setup

1. Install a graph tool (required):

# code-review-graph
npx code-review-graph install

# Graphify (Claude Code)
/plugin marketplace add safishamsi/graphify && /graphify

2. Create .anneal/config.toml in your project root:

[budget]
default_tokens = 5000
strategy = "balanced"   # "minimal" | "balanced" | "thorough"

[solver]
backend = "simulated-annealing"
num_reads = 100
num_sweeps = 1000

Add .anneal/ to your .gitignore.

MCP Server Setup

Claude Code

Add to .claude/settings.json:

{
  "mcpServers": {
    "anneal": {
      "command": "anneal-server"
    }
  }
}

Gemini CLI

Add to ~/.gemini/settings.json:

{
  "mcpServers": {
    "anneal": {
      "command": "anneal-server"
    }
  }
}

OpenAI Codex CLI

Add to ~/.codex/config.toml:

[[mcp_servers]]
name = "anneal"
command = "anneal-server"

Cursor / VS Code + Copilot / Aider

Any MCP-compatible client: run anneal-server via stdio transport.
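Under the stdio transport, an MCP client spawns the server process and exchanges newline-delimited JSON-RPC 2.0 messages over its stdin/stdout. A minimal sketch of the opening handshake message (the protocolVersion string is an assumption and depends on your client's MCP revision):

```python
import json

def initialize_message(msg_id=1):
    """JSON-RPC 2.0 initialize request per the Model Context Protocol handshake."""
    return {
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumption: match your client's revision
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.0.1"},
        },
    }

line = json.dumps(initialize_message()) + "\n"
# A client would spawn `anneal-server` with subprocess, write `line` to its
# stdin, and read the newline-delimited JSON-RPC response from its stdout.
```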

Tools

get_optimal_context

Parameters:
  task_description: str     -- what you want to do
  token_budget: int | None  -- max tokens (default: 5000)
  include_files: list[str]  -- always include these paths
  exclude_files: list[str]  -- never include these paths
  strategy: str             -- "balanced" | "minimal" | "thorough"

Returns:
  selected_chunks: list[{path, content, relevance_score, tokens}]
  total_tokens: int
  budget_utilization: float
  stability_score: float
  dependency_graph: dict
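A quick way to see how the fields fit together: the response below is fabricated for illustration, and it assumes budget_utilization is simply total_tokens / token_budget, which the source does not confirm.

```python
# Hypothetical get_optimal_context response -- every value here is invented.
response = {
    "selected_chunks": [
        {"path": "auth/login.py",   "content": "...", "relevance_score": 0.91, "tokens": 1200},
        {"path": "auth/session.py", "content": "...", "relevance_score": 0.74, "tokens": 800},
    ],
    "total_tokens": 2000,
    "budget_utilization": 0.4,   # assumed: 2000 / 5000 (default token_budget)
    "stability_score": 0.88,
    "dependency_graph": {"auth/login.py": ["auth/session.py"]},
}

def check_response(resp, token_budget=5000):
    """Sanity-check the internal consistency of a response dict."""
    assert resp["total_tokens"] == sum(c["tokens"] for c in resp["selected_chunks"])
    assert resp["total_tokens"] <= token_budget
    assert abs(resp["budget_utilization"] - resp["total_tokens"] / token_budget) < 1e-9
    return True
```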

get_status

Returns graph source availability, node counts, solver config.

Development

git clone https://github.com/ameyakhot/anneal
cd anneal
uv venv && source .venv/bin/activate
uv pip install -e /path/to/spinchain
uv pip install -e ".[dev]"
python -m pytest tests/ -v

License

MIT
