KayGraph

A domain-specific language (DSL) and context-graph framework for building context-aware, production-ready AI applications.


📚 New to KayGraph?

Choose your path:

You are... Start here
👤 Human Developer Follow the 10-minute quickstart below
🤖 AI Coding Agent Load LLM_CONTEXT_KAYGRAPH_DSL.md first
🎯 Task-focused "I need to build X" → QUICK_FINDER.md
📖 Exploring all examples Browse 70 workbooks in 16 categories
⚠️ Debugging/stuck Check Common Patterns & Errors

What is KayGraph?

KayGraph is an opinionated DSL for expressing business problems as AI agent pipelines. Think of it as building blocks for AI workflows: a 500-line core that provides powerful abstractions without the bloat.

Core Philosophy:

  • 🎯 DSL-First: Express complex AI workflows declaratively
  • 🪶 Zero Dependencies: Pure Python standard library (500 lines of core code)
  • 🔧 Bring Your Own Tools: Works with any LLM, database, or service
  • 📦 Production-Ready: 70 battle-tested examples across 16 categories

Quick Start

Installation

# Using pip
pip install kaygraph

# Or from source
git clone https://github.com/KayOS-AI/KayGraph.git
cd KayGraph
pip install -e .

Your First KayGraph Workflow

from kaygraph import Node, Graph

class AnalyzeNode(Node):
    def prep(self, shared):
        """Phase 1: Read from shared context"""
        return shared.get("text")

    def exec(self, text):
        """Phase 2: Execute logic (LLM call, API, etc.)"""
        return f"Analyzed: {text}"

    def post(self, shared, prep_res, exec_res):
        """Phase 3: Write results back to shared context"""
        shared["result"] = exec_res
        return None  # End of workflow

# Build and run
analyze = AnalyzeNode()
graph = Graph(analyze)

shared = {"text": "Hello KayGraph!"}
graph.run(shared)
print(shared["result"])  # "Analyzed: Hello KayGraph!"

That's it! Three phases: prep() → exec() → post()


For Humans: Learning Path

1. Start with the Basics (10 minutes)

# Try the simplest example
cd workbooks/01-getting-started/kaygraph-hello-world
python main.py

2. Explore by Use Case

Use the task-based finder to jump to what you need:

📋 workbooks/QUICK_FINDER.md - "I need to build..."

  • An AI Agent → Examples + patterns
  • A Chatbot → Chat patterns
  • A RAG System → Retrieval patterns
  • Batch Processing → Data pipeline patterns
  • Production API → Deployment examples

3. Browse All 70 Examples

📚 workbooks/WORKBOOK_INDEX_CONSOLIDATED.md - Complete catalog

16 Categories:

  1. Getting Started (1)
  2. Core Patterns (2)
  3. Batch Processing (5)
  4. AI Agents (9)
  5. Workflows (12)
  6. AI Reasoning (4)
  7. Chat & Conversation (4)
  8. Memory Systems (3)
  9. RAG & Retrieval (1)
  10. Code Development (2)
  11. Data & SQL (4)
  12. Tools Integration (7)
  13. Production & Monitoring (8)
  14. UI/UX (4)
  15. Streaming & Realtime (2)
  16. Advanced Patterns (2)

For Coding Agents: DSL Reference

🤖 LLM_CONTEXT_KAYGRAPH_DSL.md - Complete DSL specification for AI agents

This document contains everything a coding agent needs to:

  • Understand the 3-phase node lifecycle
  • Build graphs with proper action routing
  • Use all node types (Async, Batch, Parallel, Validated, Metrics)
  • Follow production patterns
  • Avoid common anti-patterns

For AI Assistants (Claude, GPT-4, etc.):

Load the LLM_CONTEXT_KAYGRAPH_DSL.md file to understand KayGraph's
domain-specific language and generate production-ready code.

Core Concepts (5-Minute Overview)

The 3-Phase Node Lifecycle

Every node follows this pattern:

class MyNode(Node):
    def prep(self, shared):
        """
        Phase 1: READ from shared store
        - Gather data needed for execution
        - Access shared context
        - Return data for exec()
        """
        return shared.get("input_data")

    def exec(self, prep_res):
        """
        Phase 2: EXECUTE logic (NO shared access!)
        - Process data (LLM calls, APIs, etc.)
        - Pure function - can be retried
        - Return results
        """
        return process_data(prep_res)

    def post(self, shared, prep_res, exec_res):
        """
        Phase 3: WRITE to shared store and route
        - Update shared context with results
        - Return action string for routing
        - Return None for default/end
        """
        shared["output"] = exec_res
        return "next_action"  # or None

Why this matters:

  • prep() and post() can read and write the shared store; exec() cannot
  • exec() can be retried independently (resilience!)
  • Clear separation of concerns
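Because exec() receives only what prep() returned, it can be exercised without building a graph or a shared store at all. A minimal standalone sketch (the real AnalyzeNode from the Quick Start subclasses kaygraph.Node; here it is a plain class so the snippet runs by itself):

```python
# exec() is a pure function of prep()'s return value, so it can be
# unit-tested in isolation -- no Graph, no shared dict required.
# Standalone sketch; the real class subclasses kaygraph.Node.
class AnalyzeNode:
    def exec(self, text):
        return f"Analyzed: {text}"

# Call exec() directly, exactly as a retry wrapper would re-invoke it.
assert AnalyzeNode().exec("Hello KayGraph!") == "Analyzed: Hello KayGraph!"
```

This purity is what makes independent retries safe: re-running exec() with the same prep() output cannot corrupt the shared context.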

Graph Composition

# Chain nodes with default flow
node1 >> node2 >> node3

# Named actions for branching
decision_node >> ("approve", approval_node)
decision_node >> ("reject", rejection_node)

# Complex workflows
extract >> transform >> ("validate", validator)
validator >> ("success", loader)
validator >> ("failed", error_handler)

Shared Store Pattern

# Simple dictionary for context
shared = {
    "user_id": "123",
    "input": "Analyze this text",
    "history": []
}

# Nodes read and write to it
graph.run(shared)

# Results available after execution
print(shared["analysis_result"])

Key Features

Node Types

Type Use Case Example
Node Standard sync operations API calls, file I/O
AsyncNode I/O-bound async operations Concurrent API calls
BatchNode Process iterables Process 1000 records
ParallelBatchNode Concurrent batch processing Parallel data transforms
ValidatedNode Input/output validation Production pipelines
MetricsNode Performance tracking Monitoring, profiling
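The table above distinguishes node types by how exec() is driven. As a rough illustration of the BatchNode idea (a stand-in, not kaygraph's real implementation, whose internals may differ): prep() returns an iterable, exec() runs once per item, and post() receives the list of per-item results.

```python
# Stand-in sketch of a BatchNode-style lifecycle. Assumption: kaygraph's
# real BatchNode may drive exec() differently; the shape below only
# mirrors the 3-phase pattern described in this README.
class BatchSketch:
    def prep(self, shared):
        return shared["records"]          # iterable of work items

    def exec(self, item):
        return item.upper()               # per-item logic (pure)

    def post(self, shared, prep_res, exec_res):
        shared["processed"] = exec_res    # list of per-item results

    def run(self, shared):
        items = self.prep(shared)
        results = [self.exec(i) for i in items]
        self.post(shared, items, results)

shared = {"records": ["a", "b", "c"]}
BatchSketch().run(shared)
print(shared["processed"])  # ['A', 'B', 'C']
```

A ParallelBatchNode would apply the same per-item exec() concurrently (e.g. via a thread pool) instead of in a sequential loop.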

Production Features

  • ✅ Retry Logic: Built-in with max_retries and wait
  • ✅ Fallback Handling: exec_fallback() for graceful degradation
  • ✅ Validation: Input/output type checking
  • ✅ Metrics: Execution time, retry counts, success rates
  • ✅ Logging: Comprehensive debug support
  • ✅ Context Managers: Resource cleanup
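The retry-then-fallback control flow looks roughly like the sketch below. The names max_retries, wait, and exec_fallback() come from the feature list above, but treating them as constructor arguments on a stand-in class is an assumption for illustration, not kaygraph's documented signature.

```python
import time

class ResilientSketch:
    """Stand-in showing retry + fallback flow; not kaygraph's real base class."""
    def __init__(self, max_retries=3, wait=0.0):
        self.max_retries = max_retries
        self.wait = wait

    def exec(self, prep_res):
        # Simulate a flaky dependency that always fails in this sketch.
        raise RuntimeError("flaky upstream service")

    def exec_fallback(self, prep_res, exc):
        # Graceful degradation once retries are exhausted.
        return f"degraded result after {self.max_retries} attempts: {exc}"

    def _exec_with_retries(self, prep_res):
        for attempt in range(self.max_retries):
            try:
                return self.exec(prep_res)
            except Exception as exc:
                if attempt == self.max_retries - 1:
                    return self.exec_fallback(prep_res, exc)
                time.sleep(self.wait)   # back off before the next attempt

print(ResilientSketch(max_retries=2)._exec_with_retries("query"))
```

Because exec() is pure, each retry re-runs it with the same prep() output; only post() ever touches the shared store, so failed attempts leave no partial state behind.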

Example Patterns

Agent Pattern

# Decision-making loop
think >> analyze >> ("use_tool", tool_node)
analyze >> ("respond", response_node)
tool_node >> think  # Loop back for reasoning

RAG Pattern

# Offline indexing
extract >> chunk >> embed >> store

# Online retrieval
query >> search >> rerank >> generate

Workflow Pattern

# Human-in-the-loop
process >> review >> ("approve", execute)
review >> ("reject", notify)
review >> ("modify", process)  # Loop back

Common Use Cases

I want to build... Start here Combine with
ChatGPT Clone chat-memory streaming-llm + chat-guardrail
Research Assistant agent rag + tool-search + agent-tools
Data Pipeline workflow batch + validated-pipeline
Multi-Agent System multi-agent supervisor + agent-memory
Production API production-ready-api metrics-dashboard + fault-tolerant

See workbooks/QUICK_FINDER.md for the complete list.


Scaffolding Tool

Generate production-ready boilerplate instantly:

# Generate a basic node
python scripts/kaygraph_scaffold.py node DataProcessor

# Generate an agent
python scripts/kaygraph_scaffold.py agent ResearchBot

# Generate a RAG system
python scripts/kaygraph_scaffold.py rag DocumentQA

# Generate a chat application
python scripts/kaygraph_scaffold.py chat CustomerSupport

# See all templates
python scripts/kaygraph_scaffold.py --help

Each template includes:

  • Complete working code
  • Documentation with TODOs
  • requirements.txt with optional dependencies
  • README with quickstart

Documentation

For Developers

For AI Coding Agents

For Quick Tasks


Why KayGraph?

The 500-Line Philosophy

KayGraph's core is intentionally 500 lines. This isn't a limitation - it's a feature.

Why?

  • ✅ You can read and understand the entire framework in one sitting
  • ✅ No hidden magic - just Python classes and composition
  • ✅ Easy to debug - it's just your code
  • ✅ No vendor lock-in - bring your own LLM, database, tools
  • ✅ Production-ready patterns without framework bloat

When humans can specify the graph, AI agents can automate it.

Zero Dependencies

The core framework has zero external dependencies. All examples that use LLMs, databases, or other services provide implementation templates - you bring your own tools.

This means:

  • 🪶 Tiny install footprint
  • 🔧 Total control over your stack
  • 🎯 Only pay for what you use
  • 🚀 No dependency hell

Project Structure

KayGraph/
├── kaygraph/                      # Core framework (500 lines!)
│   └── __init__.py                # All abstractions in one file
│
├── workbooks/                     # 70 production examples
│   ├── 01-getting-started/        # Start here
│   ├── 04-ai-agents/              # Agent patterns
│   ├── 09-rag-retrieval/          # RAG systems
│   └── ...                        # 13 more categories
│
├── scripts/                       # Scaffolding tools
│   └── kaygraph_scaffold.py       # Generate boilerplate
│
├── docs/                          # Comprehensive guides
├── tests/                         # Unit tests
│
├── LLM_CONTEXT_KAYGRAPH_DSL.md    # For coding agents
├── CLAUDE.md                      # For developers
└── README.md                      # You are here

Testing & Quality

All 70 workbooks are validated for:

  • ✅ Valid structure (README.md + main.py)
  • ✅ Valid Python syntax
  • ✅ All imports resolve
  • ✅ 100% pass rate

Run validation yourself:

python tasks/workbook-testing/validate_all_workbooks.py

Contributing

We welcome contributions! Please see our Contributing Guide for details.

Quick Contribution Ideas

  • ๐Ÿ› Fix bugs in examples
  • ๐Ÿ“ Improve documentation
  • ๐Ÿ’ก Add new workbook examples
  • ๐Ÿงช Expand test coverage
  • ๐ŸŽจ Enhance scaffolding templates

Community & Support


License

GNU Affero General Public License v3.0 (AGPL-3.0) - see LICENSE for details.

This means that if you distribute software built on KayGraph, or offer it as a network service, you must make your source code available to its users. Perfect for keeping AI workflows open and collaborative!


Quick Reference Card

# Node Lifecycle
class MyNode(Node):
    def prep(self, shared):      # 1. Read context
        return data
    def exec(self, prep_res):    # 2. Execute (pure!)
        return result
    def post(self, shared, prep_res, exec_res):  # 3. Write & route
        shared["result"] = exec_res
        return "action"  # or None

# Graph Building
node1 >> node2                   # Default flow
node1 >> ("action", node2)       # Named action
node1 >> node2 >> node3          # Chain

# Running
graph = Graph(start_node)
shared = {"input": "data"}
graph.run(shared)
print(shared["output"])

Built with โค๏ธ by the KayOS Team

Ready to build? Start with workbooks/01-getting-started/kaygraph-hello-world
