
Clone GitHub repos, build embeddings, store in Chroma, and search.

Project description

kno-sdk

A Python library for cloning, indexing, and semantically searching Git repositories using embeddings (OpenAI or SBERT) and Chroma, plus a high-level agent_query for autonomous code agents.


🚀 Features

  • Clone or update any Git repository with a single call
  • Extract semantic code chunks via Tree-Sitter grammars (functions, classes, methods, etc.)
  • Fallback to line-based chunking for unsupported languages or large files
  • Embed code or text with your choice of:
    • OpenAI's text-embedding-ada-002 via OpenAIEmbeddings
    • Local SBERT model (e.g. microsoft/graphcodebert-base) via SBERTEmbeddings
  • Persist vector store in a .kno/ folder using Chroma
  • Auto-commit & push the embedding database back to your repo
  • Fast similarity search over indexed code chunks
  • Autonomous agent for code analysis via agent_query()

📦 Installation

pip install kno-sdk

🏁 Quickstart

from kno_sdk import clone_and_index, search, EmbeddingMethod

# 1. Clone (or pull) and index a repository
repo_index = clone_and_index(
    repo_url="https://github.com/SyedGhazanferAnwar/NestJs-MovieApp",
    branch="master",
    embedding=EmbeddingMethod.SBERT,   # or EmbeddingMethod.OPENAI
    cloned_repo_base_dir="repos"       # where to clone locally
)
print("Indexed at:", repo_index.path)
print("Directory snapshot:\n", repo_index.digest)

# 2. Perform semantic search
results = search(
    repo_url="https://github.com/SyedGhazanferAnwar/NestJs-MovieApp",
    branch="master",
    embedding=EmbeddingMethod.SBERT,
    cloned_repo_base_dir="repos",
    query="NestFactory",
    k=5
)
for i, chunk in enumerate(results, 1):
    print(f"--- Result #{i} ---\n{chunk}\n")

# 3. Autonomous Code-Analysis Agent
from kno_sdk import agent_query, EmbeddingMethod, LLMProvider

# First create a repo index
repo_index = clone_and_index(
    repo_url="https://github.com/WebGoat/WebGoat",
    branch="main",
    embedding=EmbeddingMethod.SBERT,
    cloned_repo_base_dir="repos"
)

# Then use the index with agent_query
result = agent_query(
    repo_index=repo_index,
    llm_provider=LLMProvider.ANTHROPIC,
    llm_model="claude-3-haiku-20240307",
    llm_temperature=0.0,
    llm_max_tokens=4096,
    llm_system_prompt="You are a senior code-analysis agent.",
    prompt="Find issues, bugs and vulnerabilities in this repo, and explain each with exact code locations.",
    MODEL_API_KEY="your_api_key_here"
)

print(result)

📖 API Reference

clone_and_index(...) → RepoIndex

Clone (or pull) a repository, embed its files, and persist a Chroma database in a .kno/ folder, then commit and push the .kno/ folder back to the original repo.

def clone_and_index(
    repo_url: str,
    branch: str = "main",
    embedding: EmbeddingMethod = EmbeddingMethod.SBERT,
    cloned_repo_base_dir: str = "."
) -> RepoIndex
  • repo_url — Git HTTPS/SSH URL

  • branch — branch to clone or update (default: main)

  • embedding — EmbeddingMethod.OPENAI or EmbeddingMethod.SBERT

  • cloned_repo_base_dir — local directory to clone into (default: current working dir)

Returns a RepoIndex object with the following fields (see the usage sketch after the list):

  • path: pathlib.Path — local clone directory

  • digest: str — textual snapshot of the directory tree

  • vector_store: Chroma — the Chroma collection instance
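
A minimal sketch of using these fields directly; it assumes the returned vector_store behaves like a standard LangChain Chroma instance with a similarity_search method (for the supported high-level path, use search() below):

from kno_sdk import clone_and_index, EmbeddingMethod

repo_index = clone_and_index(
    repo_url="https://github.com/SyedGhazanferAnwar/NestJs-MovieApp",
    branch="master",
    embedding=EmbeddingMethod.SBERT,
    cloned_repo_base_dir="repos",
)

# Assumption: vector_store exposes the LangChain Chroma API, so
# similarity_search returns Documents with a page_content attribute.
docs = repo_index.vector_store.similarity_search("NestFactory", k=3)
for doc in docs:
    print(doc.page_content[:200])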

search(...) → List[str]

Run a similarity search on an existing .kno/ Chroma database.

def search(
    repo_url: str,
    branch: str = "main",
    embedding: EmbeddingMethod = EmbeddingMethod.SBERT,
    query: str = "",
    k: int = 8,
    cloned_repo_base_dir: str = "."
) -> List[str]
  • query — your natural-language or code search prompt

  • k — number of top results to return

Returns a list of the top-k matching code/text chunks.

agent_query(...) → str

High-level agent that clones, indexes, and then iteratively uses tools (search_code, read_file, etc.) plus an LLM to fulfill your prompt.

def agent_query(
    repo_url: str,
    branch: str = "main",
    embedding: EmbeddingMethod = EmbeddingMethod.SBERT,
    cloned_repo_base_dir: str = str(Path.cwd()),
    llm_provider: LLMProvider = LLMProvider.ANTHROPIC,
    llm_model: str = "claude-3-haiku-20240307",
    llm_temperature: float = 0.0,
    llm_max_tokens: int = 4096,
    llm_system_prompt: str = "",
    prompt: str = "",
    MODEL_API_KEY: str = "",
) -> str
  • repo_url, branch, embedding, cloned_repo_base_dir — same as above

  • llm_provider — LLMProvider.OPENAI or LLMProvider.ANTHROPIC

  • llm_model — model name (e.g. "gpt-4" or "claude-3-haiku-20240307")

  • llm_temperature, llm_max_tokens — sampling params

  • llm_system_prompt — initial system message for the agent

  • prompt — your user query/task description

  • MODEL_API_KEY — sets OPENAI_API_KEY or ANTHROPIC_API_KEY, depending on llm_provider

Returns the agent's Final Answer as a string.
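
A sketch of an OpenAI-backed call, following the reference signature above (the model name, prompt, and key are illustrative; note that the Quickstart shows an alternative that passes a pre-built repo_index):

from kno_sdk import agent_query, EmbeddingMethod, LLMProvider

answer = agent_query(
    repo_url="https://github.com/WebGoat/WebGoat",
    branch="main",
    embedding=EmbeddingMethod.SBERT,
    cloned_repo_base_dir="repos",
    llm_provider=LLMProvider.OPENAI,
    llm_model="gpt-4",
    llm_temperature=0.0,
    llm_max_tokens=4096,
    llm_system_prompt="You are a senior code-analysis agent.",
    prompt="Summarise the authentication flow and point out weak spots.",
    MODEL_API_KEY="your_openai_api_key",  # exported as OPENAI_API_KEY
)
print(answer)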

EmbeddingMethod

class EmbeddingMethod(str, Enum):
    OPENAI = "OpenAIEmbeddings"
    SBERT  = "SBERTEmbeddings"

Choose between OpenAI's hosted embeddings or a local SBERT model.
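
For example, to index with OpenAI's hosted embeddings instead of the local SBERT default; this sketch assumes the OpenAI embedding backend reads OPENAI_API_KEY from the environment, since clone_and_index itself takes no key argument:

import os
from kno_sdk import clone_and_index, EmbeddingMethod

# Assumption: the OpenAI embedding backend picks the key up from the environment.
os.environ["OPENAI_API_KEY"] = "your_openai_api_key"

repo_index = clone_and_index(
    repo_url="https://github.com/SyedGhazanferAnwar/NestJs-MovieApp",
    branch="master",
    embedding=EmbeddingMethod.OPENAI,
    cloned_repo_base_dir="repos",
)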

🔍 How It Works

  1. Clone or Pull: uses GitPython to make a depth-1 clone or pull the latest changes.

  2. Directory Snapshot: builds a small "digest" of files and folders (up to ~1K tokens).

  3. Chunk Extraction

    • Tree-sitter for language-aware extraction of functions, classes, etc.

    • Fallback to fixed-size line chunks for unknown languages or large files (see the sketch after this list).

  4. Embedding

    • Streams each chunk into your chosen embedding backend.

    • Respects a 16,000-token cap per chunk.

  5. Vector Store

    • Persists embeddings in a namespaced Chroma collection under .kno/.

    • Only indexes files once (skips already-populated collections).

  6. Commit & Push

    • Automatically stages, commits, and pushes .kno/ back to your remote.

  7. Autonomous Agent

    • RAG prompt

    • Tool calls (search_code, read_file, …)

    • Iterative LLM planning & execution

    • Stops on "Final Answer:" or max iterations
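
To make the step-3 fallback concrete, here is an illustrative fixed-size line chunker of the kind described above; it is a simplified stand-in, not the SDK's actual implementation, and the window and overlap sizes are arbitrary:

from pathlib import Path
from typing import List

def chunk_by_lines(path: Path, lines_per_chunk: int = 80, overlap: int = 10) -> List[str]:
    """Split a file into overlapping fixed-size line windows (illustrative only)."""
    lines = path.read_text(errors="ignore").splitlines()
    step = lines_per_chunk - overlap
    chunks = []
    for start in range(0, len(lines), step):
        window = lines[start:start + lines_per_chunk]
        if window:
            chunks.append("\n".join(window))
    return chunks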

⚙️ Configuration

  • Skip directories: .git, node_modules, build, dist, target, .vscode, .kno

  • Skip files: package-lock.json, yarn.lock, .prettierignore

  • Binary extensions: common image, audio, video, archive, font, and binary file types

All of the above can be modified by forking the source and adjusting the skip_dirs, skip_files, and BINARY_EXTS sets.
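
As a rough sketch, the filter sets look like this (directory and file names are taken from the list above; the binary-extension values shown are representative examples, not the full set):

# Illustrative shape of the filters; the real definitions live in the kno-sdk source.
skip_dirs = {".git", "node_modules", "build", "dist", "target", ".vscode", ".kno"}
skip_files = {"package-lock.json", "yarn.lock", ".prettierignore"}
BINARY_EXTS = {".png", ".jpg", ".mp3", ".mp4", ".zip", ".ttf", ".exe"}  # representative only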

🔧 Dependencies

kno-sdk builds on GitPython (cloning and pulling), Tree-sitter grammars (language-aware chunk extraction), Chroma (vector store), and either OpenAI embeddings or a local SBERT model (e.g. microsoft/graphcodebert-base) as the embedding backend.

🤝 Contributing

  1. Fork this repo

  2. Create your feature branch (git checkout -b feature/AmazingFeature)

  3. Commit your changes (git commit -m 'Add amazing feature')

  4. Push to the branch (git push origin feature/AmazingFeature)

  5. Open a Pull Request

Please run pytest before submitting and follow the existing code style.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kno_sdk-1.4.6.tar.gz (18.5 kB)

Uploaded: Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

kno_sdk-1.4.6-py2.py3-none-any.whl (15.9 kB)

Uploaded: Python 2, Python 3

File details

Details for the file kno_sdk-1.4.6.tar.gz.

File metadata

  • Download URL: kno_sdk-1.4.6.tar.gz
  • Upload date:
  • Size: 18.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.12

File hashes

Hashes for kno_sdk-1.4.6.tar.gz

  • SHA256: 7e0bbae7a0dc797351f5596983f318948430eaf36dbfef0768a577e69f7aeee0
  • MD5: 29640e7d44881ae652c4f119cf7952db
  • BLAKE2b-256: 2396dbe9307bfea2e30ab8f4cc3b8e46567e8862dfd92e6aa3b0becb1d510f0f
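
To check a downloaded archive against the published SHA256 digest, for example:

import hashlib

with open("kno_sdk-1.4.6.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print(digest == "7e0bbae7a0dc797351f5596983f318948430eaf36dbfef0768a577e69f7aeee0")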


File details

Details for the file kno_sdk-1.4.6-py2.py3-none-any.whl.

File metadata

  • Download URL: kno_sdk-1.4.6-py2.py3-none-any.whl
  • Upload date:
  • Size: 15.9 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.12

File hashes

Hashes for kno_sdk-1.4.6-py2.py3-none-any.whl

  • SHA256: 7550f31fba1d68ccdbd665ce91e6db464911a401b1107c7d29ed8f0f33c31ca6
  • MD5: 9b554bac52210a2f57759325421a7234
  • BLAKE2b-256: bcd3fcdbdc8b32443d6c69c488fe2362a73d3d11c0992c42959adf47cc703fd6

