Queryable concept map of a codebase for LLM coding agents

combfind

Give an AI agent a codebase. combfind tells it where to look.

combfind builds a local index of a repository so an agent can find the right files and functions for a task with a plain-text query, without reading the entire codebase.
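For example, an agent harness might shell out to the CLI per task. A minimal sketch, assuming combfind is installed and the repository has already been indexed; `build_query_cmd` and `run_query` are hypothetical helper names, and the flags are the ones documented below:

```python
# Sketch of an agent-side wrapper around the combfind CLI.
import subprocess


def build_query_cmd(question: str, db: str = "repo.db", top_k: int = 5) -> list[str]:
    """Assemble a `combfind query` invocation from the documented flags."""
    return [
        "combfind", "query", question,
        "--db", db,
        "--top-k", str(top_k),
        "--format", "json",  # JSON output is easiest for an agent to consume
    ]


def run_query(question: str, db: str = "repo.db") -> str:
    """Run the query and return the raw JSON output."""
    result = subprocess.run(
        build_query_cmd(question, db),
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```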

Install

For local LLM inference:

pip3 install "combfind[llm]" \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu

Download a model (one-time, ~2 GB):

combfind download-model

For a remote OpenAI-compatible API instead:

pip3 install "combfind[openai]"

Usage

# Index a repository (local LLM, auto-detected model)
combfind init /path/to/repo --db repo.db

# Index using a remote OpenAI-compatible API
COMBFIND_LLM_API_KEY=sk-... COMBFIND_LLM_MODEL=gpt-4o-mini \
  combfind init /path/to/repo --db repo.db --llm-mode openai

# Query it
combfind query "how does authentication work" --db repo.db
combfind query "where are database migrations" --db repo.db --format json

Query output (JSON)

[
  {
    "rank": 1,
    "concept": "Token Refresh",
    "role": "implementation",
    "score": 0.87,
    "files": [{"path": "auth/service.py", "start_line": 42, "end_line": 91}],
    "symbols": ["AuthService.refresh", "AuthService.validate"],
    "why_relevant": "Handles session token validation and refresh logic.",
    "sibling_implementations": []
  }
]
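An agent can parse this output and turn each result into concrete file spans to read. A minimal sketch using the sample result above (field names are taken from the JSON shown; nothing else is assumed about the schema):

```python
import json

# Sample query output, as shown above.
raw = '''
[
  {
    "rank": 1,
    "concept": "Token Refresh",
    "role": "implementation",
    "score": 0.87,
    "files": [{"path": "auth/service.py", "start_line": 42, "end_line": 91}],
    "symbols": ["AuthService.refresh", "AuthService.validate"],
    "why_relevant": "Handles session token validation and refresh logic.",
    "sibling_implementations": []
  }
]
'''

results = json.loads(raw)

# Collect (path, start, end) spans for the top-ranked concept so the
# agent knows exactly which lines to read.
top = min(results, key=lambda r: r["rank"])
spans = [(f["path"], f["start_line"], f["end_line"]) for f in top["files"]]
# spans == [("auth/service.py", 42, 91)]
```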

Init options

| Flag | Default | Description |
|------|---------|-------------|
| `--db` | `<repo_path>/.combfind.db` | Output path |
| `--llm-model` | auto-detected | Path to a GGUF model file (local mode only) |
| `--llm-mode` | `local` | LLM backend: `local` (llama.cpp) or `openai` (OpenAI-compatible API) |
| `--exclude-paths` | - | Paths to skip, relative to the repo root (repeatable) |
| `--exclude-regex` | - | Regex matched against file paths to skip |

Query options

| Flag | Default | Description |
|------|---------|-------------|
| `--db` | `.combfind.db` | Database to query |
| `--top-k` | 5 | Number of results to return |
| `--format` | `text` | Output format: `text` or `json` |

Environment variables

| Variable | Default | Description |
|----------|---------|-------------|
| `COMBFIND_LOG_LEVEL` | `info` | Log verbosity: `debug`, `info`, `warning`, `error` |
| `COMBFIND_LLM_BASE_URL` | - | Base URL for an OpenAI-compatible API (e.g. `https://api.openai.com/v1`) |
| `COMBFIND_LLM_API_KEY` | - | API key for the remote LLM |
| `COMBFIND_LLM_MODEL` | `gpt-4o-mini` | Model name to use with `--llm-mode openai` |

Using a remote LLM API

Pass --llm-mode openai to use any OpenAI-compatible API instead of a local model. Configure it with environment variables:

export COMBFIND_LLM_BASE_URL=https://api.openai.com/v1
export COMBFIND_LLM_API_KEY=sk-...
export COMBFIND_LLM_MODEL=gpt-4o-mini

combfind init /path/to/repo --db repo.db --llm-mode openai

Any API that speaks the OpenAI chat completions format works, including:

  • OpenAI — set COMBFIND_LLM_BASE_URL=https://api.openai.com/v1
  • Ollama — set COMBFIND_LLM_BASE_URL=http://localhost:11434/v1 and COMBFIND_LLM_API_KEY=ollama
  • LM Studio — set COMBFIND_LLM_BASE_URL=http://localhost:1234/v1
  • Any other OpenAI-compatible server — point COMBFIND_LLM_BASE_URL at its /v1 endpoint

--llm-model is ignored in openai mode; the model is selected via COMBFIND_LLM_MODEL.

Supported languages

Python and Go. More languages can be added via tree-sitter grammars.
