
Local-first documentation management for AI-assisted development


Project Ontos


Portable context for the agentic era.

Never explain twice. Own your context.




The Problem

Context dies in three ways:

  1. AI Amnesia. You explain your architecture to Claude. Then again to ChatGPT. Then again to Cursor. Each starts from zero.

  2. Prototype Graveyards. You build fast in Streamlit, make dozens of product decisions, then rewrite in Next.js. The code is new. The decisions? Lost in old chat logs.

  3. Tribal Knowledge. Your project's "why" lives in Slack threads, abandoned docs, and your head. New collaborators (human or AI) rediscover everything from scratch.

The common thread: context isn't portable. And even when it exists, you don't own it—it's locked in proprietary platforms, unexportable, unversioned, gone when you switch providers.


The Solution

Ontos creates a portable knowledge graph that lives in your repo as markdown files with YAML frontmatter. No cloud service, no vendor lock-in.

Readable, not retrievable. Your context is a glass box—inspectable by humans, followable by AIs. Explicit structure instead of semantic search. You know exactly what the AI sees.

How it works:

  1. Run ontos scaffold to auto-tag your docs with YAML headers (or add them manually if you prefer)
  2. Run ontos map to generate your project's context map
  3. Any AI agent reads the map, loads what's relevant, sees the full decision history
Example frontmatter:

---
id: pricing_strategy
type: strategy
depends_on: [target_audience, mission]
---
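As a sketch of what this structure enables (not the actual Ontos internals), frontmatter like the above can be read with a few lines of Python. `parse_frontmatter` is a hypothetical helper written against the stdlib only; a real tool would use a YAML library:

```python
def parse_frontmatter(text: str) -> dict:
    """Minimal frontmatter parser (illustrative only, not the Ontos implementation).

    Extracts key: value pairs from a leading block delimited by '---' lines.
    Handles scalars and simple inline lists like [a, b]; a real tool would use PyYAML.
    """
    lines = text.strip().splitlines()
    if not lines or lines[0] != "---":
        return {}
    meta = {}
    for line in lines[1:]:
        if line == "---":
            break  # end of the frontmatter block
        key, _, value = line.partition(":")
        value = value.strip()
        if value.startswith("[") and value.endswith("]"):
            # inline list: split on commas and trim whitespace
            meta[key.strip()] = [v.strip() for v in value[1:-1].split(",") if v.strip()]
        else:
            meta[key.strip()] = value
    return meta

doc = """---
id: pricing_strategy
type: strategy
depends_on: [target_audience, mission]
---

# Pricing Strategy
"""
print(parse_frontmatter(doc)["id"])          # pricing_strategy
print(parse_frontmatter(doc)["depends_on"])  # ['target_audience', 'mission']
```

Because the structure is explicit key/value metadata rather than embeddings, any agent (or twenty lines of code) can recover the graph deterministically.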

The hierarchy (rule of thumb: "If this doc changes, what else breaks?"):

Layer      What It Captures                      Survives Migration?
kernel     Why you exist, core values            ✅ Always
strategy   Goals, audience, approach             ✅ Always
product    Features, user flows, requirements    ✅ Always
atom       Implementation details                ⚠️ Often rewritten

Your prototype atoms get rewritten. Your product decisions don't. Interface specs and data models often survive—implementation code rarely does.


Philosophy

Intent over automation. You decide what matters. Tagging a session, connecting decisions to documents—this friction is the feature. Curation beats capture.

You own your context. Markdown in your repo, not locked in someone else's platform. No database, no account, no API key. It travels with git clone.

Shared memory over personal memory. What you remember is useless to your teammate or the AI that just opened a fresh session. Ontos encodes knowledge at the repo level. Everyone who clones gets the same brain.

Decisions outlive code. Ontos separates Space (what IS true) from Time (what HAPPENED). Your implementation atoms get rewritten on migration. Your strategy, your product decisions, your session logs—those survive.


The Premise

  • LLMs get better; your tooling should too. Ontos doesn't fight the model—it gives the model better input. As agents improve, structured context becomes more valuable, not less.
  • Platforms won't solve portability for you. Vendor lock-in is a feature, not a bug, for model providers. If you want context that moves freely, you have to own it yourself.
  • "Vibe coding" becomes "context engineering." The bottleneck isn't generating code anymore. It's giving the AI enough context to generate the right code.

Who Ontos Is For

  • Small teams (1-5 devs) switching between AI tools who are tired of re-explaining their project to Claude, then Cursor, then ChatGPT
  • Projects that outlive their prototypes—when you rewrite from Streamlit to Next.js, your decisions should survive the migration
  • Developers who want context to transfer across session resets, tool switches, and team changes
  • Anyone betting on AI-assisted development who needs reliable, portable project memory

The litmus test: Can a new person (or AI) become productive in under 10 minutes?


Use Cases

Multi-AI Workflows

Switch between Claude Code, Cursor, ChatGPT, and Gemini without re-explaining your project. Ontos generates AGENTS.md and .cursorrules so your context activates automatically.

Prototype → Production

Built a demo in Streamlit? When you rewrite in FastAPI or Next.js, your atoms are disposable but your strategy survives. Three weeks of product decisions don't vanish with the old code.

Project Handoffs

Pass a project to another developer or agency. Because you own your context, everything travels with git clone—session logs, context map, decision history. No export wizard, no platform migration, no 2-hour call.

Documentation Health

CI validation catches broken links, circular dependencies, and architectural violations before they become tribal knowledge buried in someone's head.
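To make the checks concrete, here is a sketch of the two graph-health passes named above, run over an in-memory `id -> depends_on` mapping shaped like the frontmatter. This is an assumed illustration, not the actual `ontos doctor` code:

```python
def broken_links(graph: dict[str, list[str]]) -> list[tuple[str, str]]:
    """Return (doc, dep) pairs where dep refers to a document that doesn't exist."""
    return [(doc, dep) for doc, deps in graph.items()
            for dep in deps if dep not in graph]

def has_cycle(graph: dict[str, list[str]]) -> bool:
    """Detect circular dependencies with an iterative DFS (three-color marking)."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {doc: WHITE for doc in graph}
    for start in graph:
        if color[start] != WHITE:
            continue
        stack = [(start, iter(graph[start]))]
        color[start] = GRAY
        while stack:
            node, deps = stack[-1]
            for dep in deps:
                if dep not in graph:
                    continue  # broken link; reported separately
                if color[dep] == GRAY:
                    return True  # back edge to a node still on the stack
                if color[dep] == WHITE:
                    color[dep] = GRAY
                    stack.append((dep, iter(graph[dep])))
                    break
            else:
                color[node] = BLACK  # all dependencies explored
                stack.pop()
    return False

graph = {
    "mission": [],
    "target_audience": ["mission"],
    "pricing_strategy": ["target_audience", "mission", "old_doc"],  # old_doc was deleted
}
print(broken_links(graph))  # [('pricing_strategy', 'old_doc')]
print(has_cycle(graph))     # False
```

Running checks like these in CI means a deleted or renamed document fails the build instead of silently orphaning every doc that depended on it.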


Quick Start

Requirements: Python 3.9+; run inside a git repository

Install:

pip install ontos

Source available at github.com/ohjona/Project-Ontos.

Initialize:

cd your-project
ontos init

This creates your config, generates the context map, installs git hooks, and creates AGENTS.md for AI agent activation.

Activate: Tell any AI agent:

"Ontos" (or "Activate Ontos")

The agent reads AGENTS.md, regenerates the context map, loads relevant files, and confirms what context it has.


Workflow

Command              What It Does
"Ontos"              Activate context—agent reads the map, loads relevant files
"Archive Ontos"      End session—save decisions as a log for next time
"Maintain Ontos"     Health check—scan for new files, fix broken links, regenerate map

# CLI equivalents
ontos scaffold     # Auto-tag docs with YAML frontmatter
ontos map          # Generate/update context map
ontos log          # Create a session log
ontos doctor       # Check graph health
ontos agents       # Regenerate AGENTS.md and .cursorrules

Update: pip install --upgrade ontos


Best Practices

  • Start from the top. Define kernel and strategy before creating atoms. The hierarchy exists for a reason.
  • Curate, don't hoard. Not every session needs a log. Archive the ones with decisions that matter.
  • Review scaffold output. Auto-tagging proposes; you decide. The human judgment is the point.
  • Run ontos doctor periodically. Catch broken links and dependency issues before they compound.

What Ontos Is NOT

  • Not a RAG system. We use structural graph traversal, not semantic search. Concepts are curated tags, not vector embeddings. Deterministic beats probabilistic for critical decisions.
  • Not zero-effort. You decide what matters (curation). The tooling handles the paperwork (tagging, validation, map generation).
  • Not a cloud service. Markdown files in your repo. No API keys, no accounts.
  • Not magic. The graph and map are deterministic—same input, same output. What the AI does with that context is still AI.

If you want automatic context capture, use a vector database. If you want reliable, portable, inspectable context, use Ontos.
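"Structural graph traversal" can be sketched as a deterministic load order: given a document id, collect its transitive depends_on closure, dependencies first. The graph shape is assumed from the frontmatter example; this is an illustration, not the Ontos implementation:

```python
def load_order(graph: dict[str, list[str]], doc_id: str) -> list[str]:
    """Deterministic context for doc_id: dependencies first, then the doc itself.

    Post-order DFS over depends_on yields a topological order. Because dicts
    and the depends_on lists preserve insertion order, the same graph always
    produces the same list — no embeddings, no similarity scores.
    """
    order, seen = [], set()

    def visit(node: str) -> None:
        if node in seen:
            return
        seen.add(node)
        for dep in graph.get(node, []):
            visit(dep)
        order.append(node)  # emitted only after all dependencies

    visit(doc_id)
    return order

graph = {
    "mission": [],
    "target_audience": ["mission"],
    "pricing_strategy": ["target_audience", "mission"],
}
print(load_order(graph, "pricing_strategy"))
# ['mission', 'target_audience', 'pricing_strategy']
```

Contrast with retrieval: a vector search over the same three documents could return any subset depending on the query embedding; the traversal above returns exactly the declared dependency closure every time.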


Roadmap

Version  Status      Highlights
v3.0.0   ✅ Current  ontos agents generates AGENTS.md + .cursorrules, JSON output
v3.1     Next        Obsidian compatibility, ontos deinit, concepts → tags mapping
v4.0     Vision      MCP as primary interface, full template system, daemon mode

v3.0 transformed Ontos from repo-injected scripts into a pip-installable package with modular CLI architecture.


Documentation


Contributing

Contributions welcome. See CONTRIBUTING.md.


License

Proprietary. All rights reserved. See LICENSE for details.

