Unified Python CLI for code, docs, and otel context tooling

ossctx

ossctx is a unified Python toolkit for building local context services around source code, documentation, and telemetry.

It brings three related systems under one CLI and one package:

  • ossCtx code — source indexing, dependency analysis, call-graph queries, HTTP API, and MCP server
  • ossCtx docs — document ingestion, website crawling, GraphRAG analysis, search, HTTP API, and MCP server
  • ossCtx otel — trace/log/metric ingestion, observability APIs, OTLP compatibility, and MCP server

A unified ossCtx serve command starts everything — the Web UI, all REST APIs, all three MCP SSE servers, and the OTLP gRPC receiver — on a single HTTP port.

Project structure

  • common — shared CLI, config, database, middleware, and server helpers
  • codectx — code indexing and graph queries
  • docsctx — document ingestion, crawling, GraphRAG analysis, and retrieval
  • otelctx — telemetry ingestion, storage, search, and service graph features
  • ui — unified web UI and ASGI application factory

Installation

Install with uv:

uv venv
source .venv/bin/activate   # On Windows: .venv\Scripts\activate
uv pip install -e .

Install optional extras:

uv pip install -e ".[dev,tree-sitter,loaders]"
| Extra | Adds |
| --- | --- |
| dev | pytest, ruff, mypy, pip-audit |
| tree-sitter | Higher-fidelity code parsers |
| loaders | PDF (pymupdf) and Word (python-docx) ingestion |

Running

Unified server (recommended)

Start every service together:

uv run ossCtx serve

This launches:

| Service | Port | Path |
| --- | --- | --- |
| Web UI | 8070 | http://127.0.0.1:8070/ |
| Code REST API | 8070 | /api/code/... |
| Docs REST API | 8070 | /api/docs/... |
| Otel REST API | 8070 | /api/otel/... |
| Code MCP (SSE) | 8070 | /mcp/code/sse |
| Docs MCP (SSE) | 8070 | /mcp/docs/sse |
| Otel MCP (SSE) | 8070 | /mcp/otel/sse |
| OTLP gRPC | 4317 | grpc://0.0.0.0:4317 |
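Everything except OTLP gRPC is multiplexed on a single HTTP port by path prefix. As an illustration only (the real server mounts full ASGI apps; the dispatcher below and the `stats` path are hypothetical), the routing implied by the table looks like this:

```python
# Toy path-prefix router mirroring the unified-server layout above.
ROUTES = {
    "/api/code/": "code REST API",
    "/api/docs/": "docs REST API",
    "/api/otel/": "otel REST API",
    "/mcp/code/": "code MCP (SSE)",
    "/mcp/docs/": "docs MCP (SSE)",
    "/mcp/otel/": "otel MCP (SSE)",
}

def route(path: str) -> str:
    """Return the subsystem that would handle `path` on port 8070."""
    for prefix, name in ROUTES.items():
        if path.startswith(prefix):
            return name
    return "web UI"  # everything else falls through to the UI

print(route("/api/code/stats"))  # -> code REST API
```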

Override defaults:

uv run ossCtx serve --host 0.0.0.0 --port 8070 --grpc-port 4317 \
  --code-db .codecontext.db \
  --docs-db .docscontext.db \
  --otel-db .otelcontext.db

Run individual subsystems

uv run ossCtx code --help
uv run ossCtx docs --help
uv run ossCtx otel --help

Console commands

| Command | Description |
| --- | --- |
| ossCtx | Main entrypoint |
| codecontext | Alias for ossCtx code |
| docscontext | Alias for ossCtx docs |
| otelcontext | Alias for ossCtx otel |

Check the installed version with ossCtx --version.

Quickstart

Index and query code

ossCtx code index . --db .codecontext.db
ossCtx code stats
ossCtx code query entity MyClass

Index documents with GraphRAG

# Index PDF / markdown / HTML
ossCtx docs index ./docs --finalize

# Crawl a docs site and finalize
ossCtx docs index --url https://example.com/docs --max-pages 100 --finalize

# Check stats
ossCtx docs stats

The --finalize flag runs the full GraphRAG pipeline:

  1. Extracts entities, relationships, and claims from each chunk via LLM
  2. Embeds chunks and entities with the configured embedding model
  3. Runs Louvain community detection and generates LLM community summaries
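The shape of that pipeline can be sketched with the standard library alone. This is a toy, not the real implementation: step 1's LLM extraction is replaced by a regex over capitalized words, step 2 (embedding) is omitted, and step 3's Louvain detection is replaced by plain connected components via union-find; only the chunk → entities → graph → communities data flow is faithful:

```python
import re
from itertools import combinations

def extract_entities(chunk: str) -> set[str]:
    # Stand-in for the LLM extraction step: capitalized words as "entities".
    return set(re.findall(r"\b[A-Z][a-z]+\b", chunk))

def communities(edges):
    # Stand-in for Louvain: connected components via union-find.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for a, b in edges:
        parent[find(a)] = find(b)
    groups = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return list(groups.values())

chunks = ["Alice calls Bob.", "Bob mentors Carol.", "Dave ships Eve a parcel."]
# Co-occurrence within a chunk becomes an edge in the entity graph.
edges = [pair for c in chunks
         for pair in combinations(sorted(extract_entities(c)), 2)]
print([sorted(g) for g in communities(edges)])
# -> [['Alice', 'Bob', 'Carol'], ['Dave', 'Eve']]
```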

Ingest telemetry

ossCtx otel ingest traces.json
ossCtx otel stats
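A traces.json file like the one ingested above can be produced by hand. The sketch below emits the standard OTLP/JSON shape (resourceSpans → scopeSpans → spans); whether ossCtx otel ingest accepts exactly this layout is an assumption, and the IDs and timestamps are placeholder values:

```python
import json

# Minimal OTLP/JSON trace payload (standard OTLP shape; acceptance by
# `ossCtx otel ingest` in exactly this form is assumed, not documented).
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "demo"}}
        ]},
        "scopeSpans": [{
            "scope": {"name": "manual"},
            "spans": [{
                "traceId": "5b8efff798038103d269b633813fc60c",
                "spanId": "eee19b7ec3c1b174",
                "name": "GET /",
                "kind": 2,  # SPAN_KIND_SERVER
                "startTimeUnixNano": "1700000000000000000",
                "endTimeUnixNano": "1700000001000000000",
            }],
        }],
    }]
}

with open("traces.json", "w") as f:
    json.dump(payload, f, indent=2)
```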

DocsCtx configuration

DocsCtx uses a .docscontext.yaml config file (or environment variables) for LLM and embedding provider settings.

Ollama (local)

# .docscontext.yaml
chat:
  provider: ollama
  model: llama3.2
  base_url: http://localhost:11434
  timeout_seconds: 300

embedding:
  provider: ollama
  model: nomic-embed-text
  base_url: http://localhost:11434

Ollama Cloud (authenticated)

chat:
  provider: ollama
  model: llama3.2
  base_url: https://ollama.com
  api_key: "your-ollama-cloud-api-key"
  timeout_seconds: 120

embedding:
  provider: ollama
  model: nomic-embed-text
  base_url: https://ollama.com
  api_key: "your-ollama-cloud-api-key"

Note: The API key is sent as Authorization: Bearer <key> on every request. Store it in .docscontext.yaml (which is git-ignored by default) or pass it via the DOCSCTX_CHAT__API_KEY environment variable.
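For illustration, this is all the header amounts to; a stdlib sketch of attaching it (the key is a placeholder and the endpoint path is illustrative, not taken from this project):

```python
import urllib.request

api_key = "your-ollama-cloud-api-key"  # placeholder, never commit real keys
req = urllib.request.Request(
    "https://ollama.com/api/chat",     # illustrative endpoint
    method="POST",
    headers={"Authorization": f"Bearer {api_key}"},
)
print(req.get_header("Authorization"))  # Bearer your-ollama-cloud-api-key
```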

Azure OpenAI

chat:
  provider: azure
  model: gpt-4o
  base_url: https://<your-resource>.openai.azure.com
  api_key: "<AZURE_OPENAI_API_KEY>"
  timeout_seconds: 60

embedding:
  provider: azure
  model: text-embedding-3-small
  base_url: https://<your-resource>.openai.azure.com
  api_key: "<AZURE_OPENAI_API_KEY>"

OpenAI

chat:
  provider: openai
  model: gpt-4o-mini
  base_url: https://api.openai.com/v1
  api_key: "<OPENAI_API_KEY>"

embedding:
  provider: openai
  model: text-embedding-3-small
  api_key: "<OPENAI_API_KEY>"

All DocsCtx config fields

| Field | Default | Description |
| --- | --- | --- |
| chat.provider | none | ollama, openai, or azure |
| chat.model | gpt-4o-mini | Model name |
| chat.base_url | https://api.openai.com/v1 | API base URL |
| chat.api_key | (empty) | API key (required for cloud providers) |
| chat.timeout_seconds | 20 | HTTP timeout in seconds |
| chat.max_retries | 2 | Number of retry attempts |
| embedding.provider | (empty) | Same options as chat.provider |
| embedding.model | (empty) | Embedding model name |
| embedding.base_url | (empty) | Falls back to chat.base_url if empty |
| embedding.api_key | (empty) | Falls back to chat.api_key if empty |
| embedding.batch_size | 20 | Chunks per embedding batch |
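The fallback rules in the table can be sketched as a small resolver. The field names and defaults come from the table; the resolve_embedding function itself is hypothetical, not part of the package API:

```python
def resolve_embedding(cfg: dict) -> dict:
    """Apply the documented fallbacks: empty embedding fields inherit from chat."""
    chat = cfg.get("chat", {})
    emb = dict(cfg.get("embedding", {}))
    emb["base_url"] = emb.get("base_url") or chat.get("base_url", "")
    emb["api_key"] = emb.get("api_key") or chat.get("api_key", "")
    emb.setdefault("batch_size", 20)  # default from the table above
    return emb

cfg = {
    "chat": {"provider": "ollama", "base_url": "http://localhost:11434"},
    "embedding": {"provider": "ollama", "model": "nomic-embed-text"},
}
print(resolve_embedding(cfg)["base_url"])  # http://localhost:11434
```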

Environment variable overrides

Use double-underscore (__) for nested fields:

export DOCSCTX_CHAT__PROVIDER=ollama
export DOCSCTX_CHAT__MODEL=llama3.2
export DOCSCTX_CHAT__BASE_URL=http://localhost:11434
export DOCSCTX_CHAT__API_KEY=your-key
export DOCSCTX_EMBEDDING__PROVIDER=ollama
export DOCSCTX_EMBEDDING__MODEL=nomic-embed-text
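The double-underscore convention maps each variable onto a nested field: DOCSCTX_CHAT__MODEL becomes chat.model. A sketch of that mapping (the real parsing is done by the config layer; this function is only an illustration):

```python
import os

def nested_overrides(prefix: str = "DOCSCTX_") -> dict:
    """Turn DOCSCTX_SECTION__FIELD env vars into {"section": {"field": value}}."""
    out: dict = {}
    for name, value in os.environ.items():
        if not name.startswith(prefix):
            continue
        section, sep, field = name[len(prefix):].partition("__")
        if sep:  # only vars that actually use the __ separator
            out.setdefault(section.lower(), {})[field.lower()] = value
    return out

os.environ["DOCSCTX_CHAT__MODEL"] = "llama3.2"
print(nested_overrides()["chat"]["model"])  # llama3.2
```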

Command reference

ossCtx serve

Unified server — starts all components simultaneously.

ossCtx serve [--host HOST] [--port PORT] [--grpc-port PORT]
             [--code-db PATH] [--docs-db PATH] [--otel-db PATH]

Code commands

ossCtx code index <path> [--db PATH] [--max-file-size MB]
ossCtx code query entity <name> [--db PATH]
ossCtx code query deps <file> [--db PATH]
ossCtx code query calls <entity> [--db PATH]
ossCtx code stats [--db PATH]
ossCtx code clean [--db PATH]
ossCtx code serve [--host HOST] [--port PORT] [--db PATH]
ossCtx code mcp [--transport stdio|sse|streamable-http] [--addr HOST:PORT]
ossCtx code setup [--output PATH] [--transport TYPE]

Docs commands

ossCtx docs index <path> [--db PATH] [--finalize]
ossCtx docs index --url URL [--db PATH] [--max-pages N] [--max-depth N] [--finalize]
ossCtx docs stats [--db PATH] [--json]
ossCtx docs serve [--host HOST] [--port PORT] [--db PATH]
ossCtx docs mcp [--transport stdio|sse|streamable-http] [--db PATH]
ossCtx docs setup [--output PATH] [--transport TYPE]

Otel commands

ossCtx otel ingest <path> [--db PATH]
ossCtx otel stats [--db PATH]
ossCtx otel serve [--host HOST] [--http-port PORT] [--grpc-port PORT] [--db PATH]
ossCtx otel mcp [--transport stdio|sse|streamable-http] [--addr HOST:PORT]
ossCtx otel setup [--output PATH] [--transport TYPE]

MCP integration

Unified SSE (via ossCtx serve)

When running the unified server, all three MCP servers are available on port 8070:

{
  "servers": {
    "ossctx-code": { "type": "sse", "url": "http://127.0.0.1:8070/mcp/code/sse" },
    "ossctx-docs": { "type": "sse", "url": "http://127.0.0.1:8070/mcp/docs/sse" },
    "ossctx-otel": { "type": "sse", "url": "http://127.0.0.1:8070/mcp/otel/sse" }
  }
}
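The MCP endpoints above speak Server-Sent Events, which is a plain line protocol: each event is one or more data: lines terminated by a blank line. A minimal stdlib parser of that wire format, independent of any MCP client library:

```python
def parse_sse(stream: str):
    """Yield the data payload of each SSE event in `stream`."""
    data_lines = []
    for line in stream.splitlines():
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:  # blank line ends the event
            yield "\n".join(data_lines)
            data_lines = []

raw = 'data: {"jsonrpc": "2.0"}\n\ndata: ping\n\n'
print(list(parse_sse(raw)))  # ['{"jsonrpc": "2.0"}', 'ping']
```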

stdio (per subsystem)

ossCtx code mcp --transport stdio
ossCtx docs mcp --transport stdio
ossCtx otel mcp --transport stdio

Generate .vscode/mcp.json config:

ossCtx code setup --output .vscode/mcp.json
ossCtx docs setup --output .vscode/mcp.json
ossCtx otel setup --output .vscode/mcp.json

Supported formats

Code languages

Python, Go, JavaScript, TypeScript, Java, Rust, C#, Ruby, SQL, Kotlin, Scala, Bash, Lua, Perl, R, HTML, CSS, Markdown, JSON, XML, YAML, TOML, HCL, Dockerfile

The tree-sitter extra improves fidelity for supported languages.

Document formats

Built-in: .txt, .md, .markdown, .html, .htm

With loaders extra: .pdf, .docx

Default storage and ports

| Resource | Default |
| --- | --- |
| Code DB | .codecontext.db |
| Docs DB | .docscontext.db |
| Otel DB | .otelcontext.db |
| Unified HTTP | 8070 |
| Code API | 8080 |
| Docs API | 8090 |
| Otel HTTP | 8088 |
| OTLP gRPC | 4317 |
| Otel archive | .otel_archive/ |

Development

python -m pip install -e ".[dev,tree-sitter,loaders]"
python -m pytest -q
ruff check .
mypy .

License

MIT — see LICENSE.

Download files

Download the file for your platform.

Source Distribution

ossctx-0.1.0b11.tar.gz (433.1 kB)

Uploaded Source

Built Distribution


ossctx-0.1.0b11-py3-none-any.whl (499.4 kB)

Uploaded Python 3

File details

Details for the file ossctx-0.1.0b11.tar.gz.

File metadata

  • Download URL: ossctx-0.1.0b11.tar.gz
  • Upload date:
  • Size: 433.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ossctx-0.1.0b11.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | bb44db9651d6a6228585766cd022c00d60a8d03e97dda1e7597062749bdd71e0 |
| MD5 | d7cc170429c86f3d9170b355494defd7 |
| BLAKE2b-256 | 1e0cd7664a27034e93081d5e5373aa0ea1298bbe36af4e6cf57310cc20ebceea |


Provenance

The following attestation bundles were made for ossctx-0.1.0b11.tar.gz:

Publisher: publish.yml on RandomCodeSpace/ossctx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ossctx-0.1.0b11-py3-none-any.whl.

File metadata

  • Download URL: ossctx-0.1.0b11-py3-none-any.whl
  • Upload date:
  • Size: 499.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ossctx-0.1.0b11-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 4db2d38af3ef9b3451cf4fe8c1f8ad444b9945dc91ba3337e57b5fe2e44e351f |
| MD5 | a975cc8b2d843f4087310a5690748214 |
| BLAKE2b-256 | 1e3db255dd6e20245360cacb811105fdcfa39c6e675e6870f0e31924e4f47699 |


Provenance

The following attestation bundles were made for ossctx-0.1.0b11-py3-none-any.whl:

Publisher: publish.yml on RandomCodeSpace/ossctx

