ossctx
A unified Python CLI for code, docs, and OTel context tooling.
ossctx is a unified Python toolkit for building local context services around source code, documentation, and telemetry.
It brings three related systems under one CLI and one package:
- ossCtx code — source indexing, dependency analysis, call-graph queries, HTTP API, and MCP server
- ossCtx docs — document ingestion, website crawling, GraphRAG analysis, search, HTTP API, and MCP server
- ossCtx otel — trace/log/metric ingestion, observability APIs, OTLP compatibility, and MCP server
A unified ossCtx serve command starts everything — the Web UI, all REST APIs, all three MCP SSE servers, and the OTLP gRPC receiver — on a single HTTP port.
Project structure
- common — shared CLI, config, database, middleware, and server helpers
- codectx — code indexing and graph queries
- docsctx — document ingestion, crawling, GraphRAG analysis, and retrieval
- otelctx — telemetry ingestion, storage, search, and service graph features
- ui — unified web UI and ASGI application factory
Installation
Install with uv:
uv venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
uv pip install -e .
Install optional extras:
uv pip install -e ".[dev,tree-sitter,loaders]"
| Extra | Adds |
|---|---|
| dev | pytest, ruff, mypy, pip-audit |
| tree-sitter | Higher-fidelity code parsers |
| loaders | PDF (pymupdf) and Word (python-docx) ingestion |
Running
Unified server (recommended)
Start every service together:
uv run ossCtx serve
This launches:
| Service | Port | Path |
|---|---|---|
| Web UI | 8070 | http://127.0.0.1:8070/ |
| Code REST API | 8070 | /api/code/... |
| Docs REST API | 8070 | /api/docs/... |
| Otel REST API | 8070 | /api/otel/... |
| Code MCP (SSE) | 8070 | /mcp/code/sse |
| Docs MCP (SSE) | 8070 | /mcp/docs/sse |
| Otel MCP (SSE) | 8070 | /mcp/otel/sse |
| OTLP gRPC | 4317 | grpc://0.0.0.0:4317 |
Override defaults:
uv run ossCtx serve --host 0.0.0.0 --port 8070 --grpc-port 4317 \
--code-db .codecontext.db \
--docs-db .docscontext.db \
--otel-db .otelcontext.db
Run individual subsystems
uv run ossCtx code --help
uv run ossCtx docs --help
uv run ossCtx otel --help
Console commands
| Command | Description |
|---|---|
| ossCtx | Main entrypoint |
| codecontext | Alias for ossCtx code |
| docscontext | Alias for ossCtx docs |
| otelcontext | Alias for ossCtx otel |
Check the installed version:
ossCtx --version
Quickstart
Index and query code
ossCtx code index . --db .codecontext.db
ossCtx code stats
ossCtx code query entity MyClass
Index documents with GraphRAG
# Index PDF / markdown / HTML
ossCtx docs index ./docs --finalize
# Crawl a docs site and finalize
ossCtx docs index --url https://example.com/docs --max-pages 100 --finalize
# Check stats
ossCtx docs stats
The --finalize flag runs the full GraphRAG pipeline:
- Extracts entities, relationships, and claims from each chunk via LLM
- Embeds chunks and entities with the configured embedding model
- Runs Louvain community detection and generates LLM community summaries
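To make the community-detection step concrete, here is a simplified sketch. ossCtx uses Louvain, but plain connected components on an entity co-occurrence graph illustrates the same grouping idea; all entity names and edges below are illustrative, not ossCtx's actual data model.

```python
from collections import defaultdict

def communities(edges):
    """Group entities into communities via connected components
    (a simplified stand-in for Louvain community detection)."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, groups = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            group.add(n)
            stack.extend(adj[n] - seen)
        groups.append(group)
    return groups

# Entities extracted from chunks, linked when they co-occur
edges = [("Parser", "Lexer"), ("Lexer", "Token"), ("Crawler", "Fetcher")]
print(communities(edges))  # two communities
```

Louvain additionally optimizes modularity, so it can split a single connected component into several dense communities; connected components is only the degenerate case.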
Ingest telemetry
ossCtx otel ingest traces.json
ossCtx otel stats
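The expected shape of traces.json is not documented here; assuming ossCtx otel accepts the standard OTLP/JSON encoding (it advertises OTLP compatibility), a minimal trace file might look like this, with service name, IDs, and timestamps purely illustrative:

```json
{
  "resourceSpans": [
    {
      "resource": {
        "attributes": [
          { "key": "service.name", "value": { "stringValue": "checkout" } }
        ]
      },
      "scopeSpans": [
        {
          "spans": [
            {
              "traceId": "5b8efff798038103d269b633813fc60c",
              "spanId": "eee19b7ec3c1b174",
              "name": "GET /cart",
              "kind": 2,
              "startTimeUnixNano": "1700000000000000000",
              "endTimeUnixNano": "1700000000500000000"
            }
          ]
        }
      ]
    }
  ]
}
```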
DocsCtx configuration
DocsCtx uses a .docscontext.yaml config file (or environment variables) for LLM and embedding provider settings.
Ollama (local)
# .docscontext.yaml
chat:
provider: ollama
model: llama3.2
base_url: http://localhost:11434
timeout_seconds: 300
embedding:
provider: ollama
model: nomic-embed-text
base_url: http://localhost:11434
Ollama Cloud (authenticated)
chat:
provider: ollama
model: llama3.2
base_url: https://ollama.com
api_key: "your-ollama-cloud-api-key"
timeout_seconds: 120
embedding:
provider: ollama
model: nomic-embed-text
base_url: https://ollama.com
api_key: "your-ollama-cloud-api-key"
Note: The API key is sent as Authorization: Bearer <key> on every request. Store it in .docscontext.yaml (which is git-ignored by default) or pass it via the DOCSCTX_CHAT__API_KEY environment variable.
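The header described above can be sketched in plain Python; this mirrors the documented behavior but is not ossCtx's actual code, and the endpoint path is a placeholder.

```python
import urllib.request

api_key = "your-ollama-cloud-api-key"  # placeholder key
req = urllib.request.Request(
    "https://ollama.com/api/chat",     # illustrative endpoint
    headers={"Authorization": f"Bearer {api_key}"},
)
# The request is only constructed here, not sent.
print(req.get_header("Authorization"))  # Bearer your-ollama-cloud-api-key
```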
Azure OpenAI
chat:
provider: azure
model: gpt-4o
base_url: https://<your-resource>.openai.azure.com
api_key: "<AZURE_OPENAI_API_KEY>"
timeout_seconds: 60
embedding:
provider: azure
model: text-embedding-3-small
base_url: https://<your-resource>.openai.azure.com
api_key: "<AZURE_OPENAI_API_KEY>"
OpenAI
chat:
provider: openai
model: gpt-4o-mini
base_url: https://api.openai.com/v1
api_key: "<OPENAI_API_KEY>"
embedding:
provider: openai
model: text-embedding-3-small
api_key: "<OPENAI_API_KEY>"
All DocsCtx config fields
| Field | Default | Description |
|---|---|---|
| chat.provider | none | ollama, openai, or azure |
| chat.model | gpt-4o-mini | Model name |
| chat.base_url | https://api.openai.com/v1 | API base URL |
| chat.api_key | (empty) | API key (required for cloud providers) |
| chat.timeout_seconds | 20 | HTTP timeout in seconds |
| chat.max_retries | 2 | Number of retry attempts |
| embedding.provider | (empty) | Same options as chat.provider |
| embedding.model | (empty) | Embedding model name |
| embedding.base_url | (empty) | Falls back to chat.base_url if empty |
| embedding.api_key | (empty) | Falls back to chat.api_key if empty |
| embedding.batch_size | 20 | Chunks per embedding batch |
Environment variable overrides
Use double-underscore (__) for nested fields:
export DOCSCTX_CHAT__PROVIDER=ollama
export DOCSCTX_CHAT__MODEL=llama3.2
export DOCSCTX_CHAT__BASE_URL=http://localhost:11434
export DOCSCTX_CHAT__API_KEY=your-key
export DOCSCTX_EMBEDDING__PROVIDER=ollama
export DOCSCTX_EMBEDDING__MODEL=nomic-embed-text
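The double-underscore convention maps flat environment variables onto nested config fields. A minimal sketch of that mapping (not ossCtx's actual loader):

```python
def nested_overrides(environ, prefix="DOCSCTX_"):
    """Map DOCSCTX_SECTION__FIELD=value onto {'section': {'field': value}}."""
    config = {}
    for key, value in environ.items():
        if not key.startswith(prefix):
            continue
        section, _, field = key[len(prefix):].partition("__")
        config.setdefault(section.lower(), {})[field.lower()] = value
    return config

env = {"DOCSCTX_CHAT__PROVIDER": "ollama", "DOCSCTX_CHAT__MODEL": "llama3.2"}
print(nested_overrides(env))  # {'chat': {'provider': 'ollama', 'model': 'llama3.2'}}
```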
Command reference
ossCtx serve
Unified server — starts all components simultaneously.
ossCtx serve [--host HOST] [--port PORT] [--grpc-port PORT]
[--code-db PATH] [--docs-db PATH] [--otel-db PATH]
Code commands
ossCtx code index <path> [--db PATH] [--max-file-size MB]
ossCtx code query entity <name> [--db PATH]
ossCtx code query deps <file> [--db PATH]
ossCtx code query calls <entity> [--db PATH]
ossCtx code stats [--db PATH]
ossCtx code clean [--db PATH]
ossCtx code serve [--host HOST] [--port PORT] [--db PATH]
ossCtx code mcp [--transport stdio|sse|streamable-http] [--addr HOST:PORT]
ossCtx code setup [--output PATH] [--transport TYPE]
Docs commands
ossCtx docs index <path> [--db PATH] [--finalize]
ossCtx docs index --url URL [--db PATH] [--max-pages N] [--max-depth N] [--finalize]
ossCtx docs stats [--db PATH] [--json]
ossCtx docs serve [--host HOST] [--port PORT] [--db PATH]
ossCtx docs mcp [--transport stdio|sse|streamable-http] [--db PATH]
ossCtx docs setup [--output PATH] [--transport TYPE]
Otel commands
ossCtx otel ingest <path> [--db PATH]
ossCtx otel stats [--db PATH]
ossCtx otel serve [--host HOST] [--http-port PORT] [--grpc-port PORT] [--db PATH]
ossCtx otel mcp [--transport stdio|sse|streamable-http] [--addr HOST:PORT]
ossCtx otel setup [--output PATH] [--transport TYPE]
MCP integration
Unified SSE (via ossCtx serve)
When running the unified server, all three MCP servers are available on port 8070:
{
"servers": {
"ossctx-code": { "type": "sse", "url": "http://127.0.0.1:8070/mcp/code/sse" },
"ossctx-docs": { "type": "sse", "url": "http://127.0.0.1:8070/mcp/docs/sse" },
"ossctx-otel": { "type": "sse", "url": "http://127.0.0.1:8070/mcp/otel/sse" }
}
}
stdio (per subsystem)
ossCtx code mcp --transport stdio
ossCtx docs mcp --transport stdio
ossCtx otel mcp --transport stdio
Generate .vscode/mcp.json config:
ossCtx code setup --output .vscode/mcp.json
ossCtx docs setup --output .vscode/mcp.json
ossCtx otel setup --output .vscode/mcp.json
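Assuming the setup command emits VS Code's MCP configuration format, the generated .vscode/mcp.json for the code subsystem over stdio would look roughly like this (exact contents depend on the setup command):

```json
{
  "servers": {
    "ossctx-code": {
      "type": "stdio",
      "command": "ossCtx",
      "args": ["code", "mcp", "--transport", "stdio"]
    }
  }
}
```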
Supported formats
Code languages
Python, Go, JavaScript, TypeScript, Java, Rust, C#, Ruby, SQL, Kotlin, Scala, Bash, Lua, Perl, R, HTML, CSS, Markdown, JSON, XML, YAML, TOML, HCL, Dockerfile
The tree-sitter extra improves fidelity for supported languages.
Document formats
Built-in: .txt, .md, .markdown, .html, .htm
With loaders extra: .pdf, .docx
Default storage and ports
| Resource | Default |
|---|---|
| Code DB | .codecontext.db |
| Docs DB | .docscontext.db |
| Otel DB | .otelcontext.db |
| Unified HTTP | 8070 |
| Code API | 8080 |
| Docs API | 8090 |
| Otel HTTP | 8088 |
| OTLP gRPC | 4317 |
| Otel archive | .otel_archive/ |
Development
python -m pip install -e ".[dev,tree-sitter,loaders]"
python -m pytest -q
ruff check .
mypy .
License
MIT — see LICENSE.