
Clone GitHub repos, build embeddings, store in Chroma, and search.

Project description

kno-sdk

A Python library for cloning, indexing, and semantically searching Git repositories using embeddings (OpenAI or SBERT) and Chroma — plus a high-level agent_query for autonomous code agents.


🚀 Features

  • Clone or update any Git repository with a single call
  • Extract semantic code chunks via Tree-Sitter grammars (functions, classes, methods, etc.)
  • Fallback to line-based chunking for unsupported languages or large files
  • Embed code or text with your choice of:
    • OpenAI's text-embedding-ada-002 via OpenAIEmbeddings
    • Local SBERT model (e.g. microsoft/graphcodebert-base) via SBERTEmbeddings
  • Persist vector store in a .kno/ folder using Chroma
  • Auto-commit & push the embedding database back to your repo
  • Fast similarity search over indexed code chunks
  • Autonomous agent for code analysis via agent_query()

📦 Installation

pip install kno-sdk

🏁 Quickstart

from kno_sdk import clone_and_index, search, EmbeddingMethod

# 1. Clone (or pull) and index a repository
repo_index = clone_and_index(
    repo_url="https://github.com/SyedGhazanferAnwar/NestJs-MovieApp",
    branch="master",
    embedding=EmbeddingMethod.SBERT,  # or EmbeddingMethod.OPENAI
    cloned_repo_base_dir="repos"      # where to clone locally
)
print("Indexed at:", repo_index.path)
print("Directory snapshot:\n", repo_index.digest)

# 2. Perform semantic search
results = search(
    repo_url="https://github.com/SyedGhazanferAnwar/NestJs-MovieApp",
    branch="master",
    embedding=EmbeddingMethod.SBERT,
    cloned_repo_base_dir="repos",
    query="NestFactory",
    k=5
)
for i, chunk in enumerate(results, 1):
    print(f"--- Result #{i} ---\n{chunk}\n")

# 3. Autonomous Code-Analysis Agent
from kno_sdk import agent_query, EmbeddingMethod, LLMProvider

# First create a repo index
repo_index = clone_and_index(
    repo_url="https://github.com/WebGoat/WebGoat",
    branch="main",
    embedding=EmbeddingMethod.SBERT,
    cloned_repo_base_dir="repos"
)

# Then use the index with agent_query
result = agent_query(
    repo_index=repo_index,
    llm_provider=LLMProvider.ANTHROPIC,
    llm_model="claude-3-haiku-20240307",
    llm_temperature=0.0,
    llm_max_tokens=4096,
    llm_system_prompt="You are a senior code-analysis agent.",
    prompt="Find issues, bugs and vulnerabilities in this repo, and explain each with exact code locations.",
    MODEL_API_KEY="your_api_key_here"
)

print(result)

📖 API Reference

clone_and_index(...) → RepoIndex

Clone (or pull) a repository, embed its files, and persist a Chroma database in a .kno/ folder. Finally, commit and push the .kno/ folder back to the original repo.

def clone_and_index(
    repo_url: str,
    branch: str = "main",
    embedding: EmbeddingMethod = EmbeddingMethod.SBERT,
    cloned_repo_base_dir: str = "."
) -> RepoIndex
  • repo_url — Git HTTPS/SSH URL

  • branch — branch to clone or update (default: main)

  • embedding — EmbeddingMethod.OPENAI or EmbeddingMethod.SBERT

  • cloned_repo_base_dir — local directory to clone into (default: current working directory)

Returns a RepoIndex object with:

  • path: pathlib.Path — local clone directory

  • digest: str — textual snapshot of the directory tree

  • vector_store: Chroma — the Chroma collection instance

search(...) → List[str]

Run a similarity search on an existing .kno/ Chroma database.

def search(
    repo_url: str,
    branch: str = "main",
    embedding: EmbeddingMethod = EmbeddingMethod.SBERT,
    query: str = "",
    k: int = 8,
    cloned_repo_base_dir: str = "."
) -> List[str]
  • query — your natural-language or code search prompt

  • k — number of top results to return

Returns a list of the top-k matching code/text chunks.

agent_query(...) → str

High-level agent that clones, indexes, and then iteratively uses tools (search_code, read_file, etc.) plus an LLM to fulfill your prompt.

def agent_query(
    repo_url: str,
    branch: str = "main",
    embedding: EmbeddingMethod = EmbeddingMethod.SBERT,
    cloned_repo_base_dir: str = str(Path.cwd()),
    llm_provider: LLMProvider = LLMProvider.ANTHROPIC,
    llm_model: str = "claude-3-haiku-20240307",
    llm_temperature: float = 0.0,
    llm_max_tokens: int = 4096,
    llm_system_prompt: str = "",
    prompt: str = "",
    MODEL_API_KEY: str = "",
) -> str
  • repo_url, branch, embedding, cloned_repo_base_dir — same as above

  • llm_provider — LLMProvider.OPENAI or LLMProvider.ANTHROPIC

  • llm_model — model name (e.g. "gpt-4" or "claude-3-haiku-20240307")

  • llm_temperature, llm_max_tokens — sampling params

  • llm_system_prompt — initial system message for the agent

  • prompt — your user query/task description

  • MODEL_API_KEY — sets OPENAI_API_KEY or ANTHROPIC_API_KEY

Returns the agent's Final Answer as a string.

EmbeddingMethod

class EmbeddingMethod(str, Enum):
    OPENAI = "OpenAIEmbeddings"
    SBERT  = "SBERTEmbeddings"

Choose between OpenAI's hosted embeddings or a local SBERT model.

🔍 How It Works

  1. Clone or Pull: uses GitPython to make a shallow (depth-1) clone or pull the latest changes.

  2. Directory Snapshot: builds a small "digest" of files/folders (up to ~1K tokens).
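
  The snapshot step can be sketched as follows. Note that build_digest is a hypothetical helper name, and the character cap stands in for the library's actual token-based limit:

  ```python
  from pathlib import Path

  def build_digest(root: str, max_chars: int = 4000) -> str:
      """Walk the tree and list files/folders until the size cap is hit."""
      lines = []
      total = 0
      for p in sorted(Path(root).rglob("*")):
          if ".git" in p.parts:
              continue
          entry = str(p.relative_to(root)) + ("/" if p.is_dir() else "")
          if total + len(entry) > max_chars:
              lines.append("...")  # truncate once the budget is exhausted
              break
          lines.append(entry)
          total += len(entry)
      return "\n".join(lines)
  ```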

  3. Chunk Extraction

    • Tree-sitter for language-aware extraction of functions, classes, etc.

    • Fallback to fixed-size line chunks for unknown languages or large files.
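
  The line-based fallback might look like this minimal sketch; the window size and overlap values here are illustrative, not the library's actual defaults:

  ```python
  def chunk_by_lines(text: str, lines_per_chunk: int = 40, overlap: int = 5):
      """Split text into fixed-size line windows with a small overlap."""
      lines = text.splitlines()
      step = lines_per_chunk - overlap
      chunks = []
      for start in range(0, len(lines), step):
          window = lines[start:start + lines_per_chunk]
          if window:
              chunks.append("\n".join(window))
          if start + lines_per_chunk >= len(lines):
              break  # last window already covers the tail
      return chunks
  ```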

  4. Embedding

    • Streams each chunk into your chosen embedding backend.

    • Respects a 16 000-token cap per chunk.
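
  The 16 000-token cap could be enforced with a rough characters-per-token heuristic, as in the sketch below. The 4-chars-per-token ratio is an assumption; the real implementation may use an actual tokenizer:

  ```python
  def cap_tokens(chunk: str, max_tokens: int = 16_000, chars_per_token: int = 4) -> str:
      """Approximate token count as len(chunk) / chars_per_token and truncate."""
      limit = max_tokens * chars_per_token
      return chunk if len(chunk) <= limit else chunk[:limit]
  ```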

  5. Vector Store

    • Persists embeddings in a namespaced Chroma collection under .kno/.

    • Only indexes files once (skips already-populated collections).

  6. Commit & Push

    • Automatically stages, commits, and pushes .kno/ back to your remote.

  7. Autonomous Agent

    • RAG prompt

    • Tool calls (search_code, read_file, …)

    • Iterative LLM planning & execution

    • Stops on "Final Answer:" or max iterations
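
The agent loop above can be approximated by this stub. The run_agent name and the Action:/Observation: text convention are illustrative, not the SDK's internal protocol:

```python
def run_agent(llm, tools, prompt, max_iterations: int = 8) -> str:
    """Iteratively query the LLM, dispatch tool calls, stop on 'Final Answer:'."""
    transcript = prompt
    for _ in range(max_iterations):
        reply = llm(transcript)
        if "Final Answer:" in reply:
            return reply.split("Final Answer:", 1)[1].strip()
        if reply.startswith("Action:"):
            # naive tool-call convention: "Action: <tool_name> <argument>"
            _, name, arg = reply.split(maxsplit=2)
            observation = tools[name](arg)
            transcript += f"\n{reply}\nObservation: {observation}"
        else:
            transcript += "\n" + reply
    return "Agent stopped: max iterations reached"
```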

⚙️ Configuration

  • Skip directories: .git, node_modules, build, dist, target, .vscode, .kno

  • Skip files: package-lock.json, yarn.lock, .prettierignore

  • Binary extensions: common image, audio, video, archive, font, and binary file types

All of the above can be modified by forking the source and adjusting the skip_dirs, skip_files, and BINARY_EXTS sets.
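
A sketch of how such filters typically apply during the file walk; indexable_files is a hypothetical helper, and the BINARY_EXTS set here is abbreviated from the full lists above:

```python
from pathlib import Path

SKIP_DIRS = {".git", "node_modules", "build", "dist", "target", ".vscode", ".kno"}
SKIP_FILES = {"package-lock.json", "yarn.lock", ".prettierignore"}
BINARY_EXTS = {".png", ".jpg", ".mp3", ".mp4", ".zip", ".ttf", ".exe"}

def indexable_files(root: str):
    """Yield files that survive the directory, filename, and extension filters."""
    for p in Path(root).rglob("*"):
        if not p.is_file():
            continue
        if SKIP_DIRS.intersection(p.parts):
            continue  # inside a skipped directory
        if p.name in SKIP_FILES or p.suffix.lower() in BINARY_EXTS:
            continue  # lockfile or binary asset
        yield p
```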

🔧 Dependencies

🤝 Contributing

  1. Fork this repo

  2. Create your feature branch (git checkout -b feature/AmazingFeature)

  3. Commit your changes (git commit -m 'Add amazing feature')

  4. Push to the branch (git push origin feature/AmazingFeature)

  5. Open a Pull Request

Please run pytest before submitting and follow the existing code style.

Project details


Download files

Download the file for your platform.

Source Distribution

kno_sdk-1.4.8.tar.gz (18.5 kB)

Uploaded Source

Built Distribution


kno_sdk-1.4.8-py2.py3-none-any.whl (15.9 kB)

Uploaded Python 2, Python 3

File details

Details for the file kno_sdk-1.4.8.tar.gz.

File metadata

  • Download URL: kno_sdk-1.4.8.tar.gz
  • Upload date:
  • Size: 18.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.12

File hashes

Hashes for kno_sdk-1.4.8.tar.gz
  • SHA256: f856810e2f867214d8adfdb12d9d348c4b4321b70d4091c3b03e4bb44de15270
  • MD5: 8ce27cf0eef39789e78b7c451f989c43
  • BLAKE2b-256: 6e312921375efa1480b3ef5f7bbb93b20111ac351a014ba8dce8b1b692e0c26b


File details

Details for the file kno_sdk-1.4.8-py2.py3-none-any.whl.

File metadata

  • Download URL: kno_sdk-1.4.8-py2.py3-none-any.whl
  • Upload date:
  • Size: 15.9 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.12

File hashes

Hashes for kno_sdk-1.4.8-py2.py3-none-any.whl
  • SHA256: fc663ad7c88ed8766bb972ec5cf155806853384a476297a1ceb972167a672d78
  • MD5: 92e29c82b10fe0de675ee49843e9ea31
  • BLAKE2b-256: 19afd7cfcfd0f9e47ce25c7e7c42d885f7e6c85b5061aacc88bb4d5bd826c54e

