
Kagura Memory SDK

AI-driven memory management for Kagura Memory Cloud.

Installation

pip install kagura-memory
# or
uv add kagura-memory

For development:

git clone https://github.com/kagura-ai/kagura-memory-python-sdk.git
cd kagura-memory-python-sdk
uv sync --dev

Quick Start

Python SDK

import asyncio

from kagura_memory import KaguraAgent, Session, Message

# Initialize agent
agent = KaguraAgent(
    api_key="your_kagura_api_key",
    model="gpt-5.4-nano",
)

# Create a session
session = Session(
    messages=[
        Message(role="user", content="I want to implement OAuth2 in FastAPI"),
        Message(role="assistant", content="The recommended pattern is to use Authlib..."),
        Message(role="user", content="Got it, remember this for me"),
    ]
)

async def main() -> None:
    # Process the session (AI automatically decides what to remember/recall)
    result = await agent.process(session, verbose=2)

    print(f"Remembered: {len(result.remembered)}")
    print(f"Recalled: {len(result.recalled)}")
    print(f"Explored: {len(result.explored)}")
    print(f"Context: {result.context_used}")
    if result.llm_usage:
        print(f"Tokens: {result.llm_usage.total_tokens}")

asyncio.run(main())

CLI

Configuration

Create .kagura.json:

{
  "mcp_url": "https://memory.kagura-ai.com/mcp",
  "api_key": "your_kagura_api_key",
  "model": "gpt-5.4-nano",
  "context_id": "dev",
  "llm_api_key": "your_openai_or_anthropic_api_key"
}

Note on LLM API Keys:

  • llm_api_key in .kagura.json is optional
  • If not provided, LiteLLM will use standard environment variables:
    • OpenAI: OPENAI_API_KEY
    • Claude: ANTHROPIC_API_KEY
    • Gemini: GEMINI_API_KEY

Or use environment variables:

export KAGURA_API_KEY="your_kagura_api_key"
export KAGURA_MCP_URL="https://memory.kagura-ai.com/mcp"
export KAGURA_MODEL="gpt-5.4-nano"
export OPENAI_API_KEY="your_openai_key"  # For LLM

Usage

# AI-powered processing (auto-decides what to remember/recall)
kagura process -m "Remember: FastAPI uses Depends() for DI"
kagura process -m "Find FastAPI implementation patterns" --deep
kagura process -m "Tell me about OAuth2" -vv  # verbose

# Direct memory operations (no LLM required)
kagura remember -s "FastAPI DI pattern" --content "Use Depends()..."
kagura remember -c dev -s "OAuth2 setup" --content "..." --tags "auth,oauth"

kagura recall "FastAPI dependency injection"
kagura recall "OAuth2 implementation" -k 10

kagura explore -m "memory-uuid-here" --depth 3
kagura reference -m "memory-uuid-here"

# Delete memories (soft delete, 30-day recovery)
kagura forget -m "memory-uuid-here"
kagura forget -q "outdated test data" -k 5

# List available contexts
kagura contexts

# Show current config
kagura config show

Claude Code Integration

You can use Kagura Memory as an MCP server in Claude Code. Copy .mcp.json.example to .mcp.json and fill in your credentials:

cp .mcp.json.example .mcp.json
# Edit .mcp.json with your workspace ID and API key
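For reference, a Claude Code `.mcp.json` for a remote MCP server generally looks like the following. The server name, `type`, and header layout here are illustrative assumptions; treat `.mcp.json.example` in the repository as the authoritative template:

```json
{
  "mcpServers": {
    "kagura-memory": {
      "type": "http",
      "url": "https://memory.kagura-ai.com/mcp",
      "headers": {
        "Authorization": "Bearer your_kagura_api_key"
      }
    }
  }
}
```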

Or use the CLI via Bash:

# In Claude Code, use Bash tool:
kagura process -m "Today's takeaway: FastAPI's DI uses Depends()"


Features

Current Version (0.2.2)

  • LLM-Powered Analysis: Automatically decides what to remember/recall
  • Session-Based Input: Messages + artifacts (code, documents, errors)
  • Deep Mode (deep=True): Neural Memory graph exploration
  • Verbose Logging (levels 0-3): from silent to full debug output with Rich panels
  • Context Auto-Selection (context_id="auto"): LLM selects best context
  • Multiple LLM Support: OpenAI, Claude, Gemini, Ollama via LiteLLM
  • Type Safety: Full Pydantic validation
  • CLI Commands: Full suite of commands for AI and direct operations
  • Graceful Degradation: Continues even if LLM fails
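The graceful-degradation behavior can be illustrated with a generic fallback wrapper. This is a sketch of the pattern, not the SDK's internal code; the function names are hypothetical:

```python
import asyncio
from typing import Awaitable, Callable, TypeVar

T = TypeVar("T")

async def with_fallback(
    primary: Callable[[], Awaitable[T]],
    fallback: Callable[[], Awaitable[T]],
) -> T:
    """Run the primary (LLM-backed) path; fall back to a direct operation on failure."""
    try:
        return await primary()
    except Exception:
        return await fallback()

async def llm_analysis() -> str:
    raise RuntimeError("LLM unavailable")  # simulate an LLM outage

async def direct_recall() -> str:
    return "direct memory lookup"

# The LLM path fails, so the direct path answers instead
result = asyncio.run(with_fallback(llm_analysis, direct_recall))
```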

New in v0.2.2 (Phase 3 - CLI)

  • Direct CLI Commands: kagura remember, kagura recall, kagura forget, kagura explore, kagura reference, kagura contexts
  • No LLM Required: Direct memory operations without AI analysis
  • Flexible Context: Use --context-id or configure in .kagura.json

v0.2.1 (Phase 2.5)

  • Dynamic Tool Definitions: Fetches MCP tool specifications via tools/list
  • Enhanced Prompts: LLM receives actual parameter schemas and context info
  • Intelligent Caching: 5-minute TTL cache for tool/context definitions
  • Automatic Fallback: Uses static prompts if dynamic fetching fails
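A 5-minute TTL cache of the kind described above can be sketched generically. This is an illustration of the caching approach, not the SDK's actual implementation:

```python
import time
from typing import Any, Callable

class TTLCache:
    """Cache fetched values for a fixed time-to-live, then refetch."""

    def __init__(self, ttl_seconds: float = 300.0) -> None:  # 5-minute default
        self.ttl = ttl_seconds
        self._store: dict[str, tuple[float, Any]] = {}

    def get(self, key: str, fetch: Callable[[], Any]) -> Any:
        now = time.monotonic()
        entry = self._store.get(key)
        if entry is not None and now - entry[0] < self.ttl:
            return entry[1]  # still fresh: return cached value
        value = fetch()  # expired or missing: refetch (e.g. a tools/list call)
        self._store[key] = (now, value)
        return value

cache = TTLCache(ttl_seconds=300.0)
tools = cache.get("tools", lambda: ["remember", "recall", "explore"])
```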

Supported LLM Models

Via LiteLLM:

# OpenAI
agent = KaguraAgent(api_key="...", model="gpt-5.4-nano")

# Claude
agent = KaguraAgent(api_key="...", model="claude-sonnet-4-20250514")

# Gemini
agent = KaguraAgent(api_key="...", model="gemini/gemini-1.5-flash")

# Ollama (local)
agent = KaguraAgent(api_key="...", model="ollama/llama3")

Development

Setup

uv sync --dev

Quality Checks

uv run ruff check src/ tests/   # Lint
uv run ruff format src/ tests/  # Format
uv run pyright src/              # Type check
uv run pytest tests/ -v          # Test

License

MIT License - see LICENSE file for details.
