Local-first documentation management for AI-assisted development

Project Ontos

Portable context for the agentic era.

Never explain twice. Own your context.


The Problem

Context dies in three ways:

  1. AI Amnesia. You explain your architecture to Claude. Then again to ChatGPT. Then again to Cursor. Each starts from zero.

  2. Prototype Graveyards. You build fast in Streamlit, make dozens of product decisions, then rewrite in Next.js. The code is new. The decisions? Lost in old chat logs.

  3. Tribal Knowledge. Your project's "why" lives in Slack threads, abandoned docs, and your head. New collaborators (human or AI) rediscover everything from scratch.

The common thread: context isn't portable. And even when it exists, you don't own it—it's locked in proprietary platforms, unexportable, unversioned, gone when you switch providers.


The Solution

Ontos creates a portable knowledge graph that lives in your repo as markdown files with YAML frontmatter. No cloud service, no vendor lock-in.

Readable, not retrievable. Your context is a glass box—inspectable by humans, followable by AIs. Explicit structure instead of semantic search. You know exactly what the AI sees.

How it works:

  1. Run ontos scaffold to auto-tag your docs with YAML headers (or add them manually if you prefer)
  2. Run ontos map to generate your project's context map
  3. Any AI agent reads the map, loads what's relevant, sees the full decision history
Each tagged doc carries a YAML header declaring its id, type, and dependencies:

---
id: pricing_strategy
type: strategy
depends_on: [target_audience, mission]
---
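The header above is plain YAML between `---` fences, so any tool or agent can read it without Ontos installed. A minimal sketch (hypothetical, not Ontos's actual implementation) of parsing it with only the standard library:

```python
import re

def parse_frontmatter(text: str) -> dict:
    """Extract the YAML frontmatter between the leading '---' fences.
    Handles only the simple `key: value` and `key: [a, b]` forms shown above."""
    match = re.match(r"---\n(.*?)\n---", text, re.S)
    meta = {}
    if not match:
        return meta
    for line in match.group(1).splitlines():
        key, _, value = line.partition(":")
        value = value.strip()
        if value.startswith("[") and value.endswith("]"):
            # Inline list form: depends_on: [a, b]
            meta[key.strip()] = [v.strip() for v in value[1:-1].split(",") if v.strip()]
        else:
            meta[key.strip()] = value
    return meta

doc = """---
id: pricing_strategy
type: strategy
depends_on: [target_audience, mission]
---

Pricing rationale goes here...
"""

meta = parse_frontmatter(doc)
print(meta["id"])          # pricing_strategy
print(meta["depends_on"])  # ['target_audience', 'mission']
```

A real parser would use a YAML library; the point is that the format is inspectable with nothing more than a regex.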

The hierarchy (rule of thumb: "If this doc changes, what else breaks?"):

Layer    | What It Captures                   | Survives Migration?
---------|------------------------------------|--------------------
kernel   | Why you exist, core values         | ✅ Always
strategy | Goals, audience, approach          | ✅ Always
product  | Features, user flows, requirements | ✅ Always
atom     | Implementation details             | ⚠️ Often rewritten

Your prototype atoms get rewritten. Your product decisions don't. Interface specs and data models often survive—implementation code rarely does.
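The "what else breaks" rule implies a direction: stable layers should not depend on volatile ones. A sketch of that check, where the layer ordering comes from the table but the strictness of the rule is an assumption:

```python
# Layers ordered most to least stable, per the hierarchy table.
LAYERS = ["kernel", "strategy", "product", "atom"]

def depends_downward(doc_layer: str, dep_layer: str) -> bool:
    """True if a doc depends on a more volatile layer below its own —
    the kind of architectural violation a validator could flag."""
    return LAYERS.index(dep_layer) > LAYERS.index(doc_layer)

print(depends_downward("atom", "strategy"))  # False: atoms may depend upward
print(depends_downward("kernel", "atom"))    # True: kernel must not depend on atoms
```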


Philosophy

Intent over automation. You decide what matters. Tagging a session, connecting decisions to documents—this friction is the feature. Curation beats capture.

You own your context. Markdown in your repo, not locked in someone else's platform. No database, no account, no API key. It travels with git clone.

Shared memory over personal memory. What you remember is useless to your teammate or the AI that just opened a fresh session. Ontos encodes knowledge at the repo level. Everyone who clones gets the same brain.

Decisions outlive code. Ontos separates Space (what IS true) from Time (what HAPPENED). Your implementation atoms get rewritten on migration. Your strategy, your product decisions, your session logs—those survive.


The Premise

  • LLMs get better; your tooling should too. Ontos doesn't fight the model—it gives the model better input. As agents improve, structured context becomes more valuable, not less.
  • Platforms won't solve portability for you. Vendor lock-in is a feature, not a bug, for model providers. If you want context that moves freely, you have to own it yourself.
  • "Vibe coding" becomes "context engineering." The bottleneck isn't generating code anymore. It's giving the AI enough context to generate the right code.

Who Ontos Is For

  • Small teams (1-5 devs) switching between AI tools who are tired of re-explaining their project to Claude, then Cursor, then ChatGPT
  • Projects that outlive their prototypes—when you rewrite from Streamlit to Next.js, your decisions should survive the migration
  • Developers who want context to transfer across session resets, tool switches, and team changes
  • Anyone betting on AI-assisted development who needs reliable, portable project memory

The litmus test: Can a new person (or AI) become productive in under 10 minutes?


Use Cases

Multi-AI Workflows

Switch between Claude Code, Cursor, ChatGPT, and Gemini without re-explaining your project. Ontos generates AGENTS.md and .cursorrules so your context activates automatically.

Prototype → Production

Built a demo in Streamlit? When you rewrite in FastAPI or Next.js, your atoms are disposable but your strategy survives. Three weeks of product decisions don't vanish with the old code.

Project Handoffs

Pass a project to another developer or agency. Because you own your context, everything travels with git clone—session logs, context map, decision history. No export wizard, no platform migration, no 2-hour call.

Documentation Health

CI validation catches broken links, circular dependencies, and architectural violations before they become tribal knowledge buried in someone's head.


Quick Start

Requirements: Python 3.9+; run inside a git repository

Install:

pip install ontos

Source available at github.com/ohjona/Project-Ontos.

Initialize:

cd your-project
ontos init

This creates your config, generates the context map, installs git hooks, and creates AGENTS.md for AI agent activation.

Activate: Tell any AI agent:

"Ontos" (or "Activate Ontos")

The agent reads AGENTS.md, regenerates the context map, loads relevant files, and confirms what context it has.


Workflow

Command          | What It Does
-----------------|------------------------------------------------------------
"Ontos"          | Activate context: agent reads the map, loads relevant files
"Archive Ontos"  | End session: save decisions as a log for next time
"Maintain Ontos" | Health check: scan for new files, fix broken links, regenerate map
# CLI equivalents
ontos scaffold     # Auto-tag docs with YAML frontmatter
ontos map          # Generate/update context map
ontos log          # Create a session log
ontos doctor       # Check graph health
ontos agents       # Regenerate AGENTS.md and .cursorrules

Update: pip install --upgrade ontos


Best Practices

  • Start from the top. Define kernel and strategy before creating atoms. The hierarchy exists for a reason.
  • Curate, don't hoard. Not every session needs a log. Archive the ones with decisions that matter.
  • Review scaffold output. Auto-tagging proposes; you decide. The human judgment is the point.
  • Run ontos doctor periodically. Catch broken links and dependency issues before they compound.

What Ontos Is NOT

  • Not a RAG system. We use structural graph traversal, not semantic search. Concepts are curated tags, not vector embeddings. Deterministic beats probabilistic for critical decisions.
  • Not zero-effort. You decide what matters (curation). The tooling handles the paperwork (tagging, validation, map generation).
  • Not a cloud service. Markdown files in your repo. No API keys, no accounts.
  • Not magic. The graph and map are deterministic—same input, same output. What the AI does with that context is still AI.

If you want automatic context capture, use a vector database. If you want reliable, portable, inspectable context, use Ontos.


Roadmap

Version | Status     | Highlights
--------|------------|---------------------------------------------------------------
v3.0.0  | ✅ Current | ontos agents generates AGENTS.md + .cursorrules, JSON output
v3.1    | Next       | Obsidian compatibility, ontos deinit, concepts → tags mapping
v4.0    | Vision     | MCP as primary interface, full template system, daemon mode

v3.0 transformed Ontos from repo-injected scripts into a pip-installable package with modular CLI architecture.


Feedback

This is a source-available project. The code is public for transparency, not contribution.

Welcome:

  • Bug reports and feature requests via GitHub Issues
  • Questions and feedback

Not accepted:

  • Pull requests (proprietary codebase)

License

Proprietary. All rights reserved. See LICENSE for details.
