
hyperstack-langgraph

Knowledge graph memory for LangGraph agents. Developer-controlled, zero LLM cost, time-travel debugging.

Install

pip install hyperstack-langgraph langchain-core langgraph

Quick Start (3 lines)

from hyperstack_langgraph import create_memory_agent
from langchain_openai import ChatOpenAI

agent = create_memory_agent(ChatOpenAI(model="gpt-4o"))

That's it. Your agent now has persistent knowledge graph memory. It will:

  • Search memory at the start of every conversation
  • Store important facts when decisions are made (with user confirmation)
  • Traverse the graph to answer "what depends on X?" or "who decided Y?"
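To hold a multi-turn conversation with the agent created above, each call follows the standard LangGraph invocation shape. A minimal sketch of a wrapper helper (the `ask` helper is illustrative, not part of the package):

```python
def ask(agent, thread_id: str, text: str) -> dict:
    """Send one user message to the agent on a given conversation thread.

    `agent` is any LangGraph runnable exposing .invoke(); the payload and
    config shapes follow the LangGraph prebuilt-agent convention.
    """
    return agent.invoke(
        {"messages": [{"role": "user", "content": text}]},
        config={"configurable": {"thread_id": thread_id}},
    )
```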

Environment Variables

export HYPERSTACK_API_KEY=hs_your_key    # Get free at cascadeai.dev/hyperstack
export HYPERSTACK_WORKSPACE=default
export OPENAI_API_KEY=sk-...             # For your LLM
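Because configuration comes from the environment, it can help to fail fast before constructing the agent. A small sketch (the `missing_env` helper is ours, not part of the package):

```python
import os

def missing_env(required):
    """Return the names of environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

# Check everything the agent needs before building it.
needed = ["HYPERSTACK_API_KEY", "HYPERSTACK_WORKSPACE", "OPENAI_API_KEY"]
if missing_env(needed):
    print("Missing:", ", ".join(missing_env(needed)))
```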

Usage: Add Memory Tools to Existing Agent

from hyperstack_langgraph import create_hyperstack_tools
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

# Create memory tools
memory_tools = create_hyperstack_tools()

# Add to your existing tools
my_tools = [my_calculator, my_web_search] + memory_tools

# Create agent with memory
agent = create_react_agent(ChatOpenAI(model="gpt-4o"), my_tools)

# Use it
result = agent.invoke(
    {"messages": [{"role": "user", "content": "What do we know about our auth setup?"}]},
    config={"configurable": {"thread_id": "session-1"}}
)
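The `invoke` result follows LangGraph's message-state convention: a dict with a `messages` list whose last entry is the agent's reply. A hedged helper for pulling out the final text (assuming message objects expose `.content`, as LangChain message classes do):

```python
def last_reply(result: dict) -> str:
    """Return the text of the final message in a LangGraph agent result."""
    final = result["messages"][-1]
    # LangChain message objects expose .content; plain dicts use ["content"].
    return final.content if hasattr(final, "content") else final["content"]
```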

Usage: Full Agent with Session Memory

from hyperstack_langgraph import create_memory_agent
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver

agent = create_memory_agent(
    ChatOpenAI(model="gpt-4o"),
    checkpointer=MemorySaver(),  # Session memory (optional)
)

config = {"configurable": {"thread_id": "project-alpha"}}

# First message — agent searches HyperStack for context
agent.invoke({"messages": [{"role": "user", "content": "Let's work on the auth system"}]}, config)

# Agent remembers within session (MemorySaver) AND across sessions (HyperStack)
agent.invoke({"messages": [{"role": "user", "content": "We decided to use Clerk"}]}, config)
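Session memory is keyed by `thread_id`: reusing an id continues a checkpointed session, while a new id starts a fresh one (HyperStack memory persists either way). A tiny helper for building these configs, illustrative only:

```python
def thread_config(thread_id: str) -> dict:
    """Build the LangGraph config dict that scopes checkpointed state to one thread."""
    return {"configurable": {"thread_id": thread_id}}

# Same id -> same checkpointed session; new id -> fresh session,
# but both share the cross-session HyperStack knowledge graph.
alpha = thread_config("project-alpha")
beta = thread_config("project-beta")
```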

Usage: Direct API Client

from hyperstack_langgraph import HyperStackClient

client = HyperStackClient()

# Store
client.store("use-clerk", "Use Clerk for Auth", "Chose Clerk over Auth0",
             card_type="decision", keywords=["clerk", "auth"],
             links=[{"target": "alice", "relation": "decided"}])

# Search
client.search("authentication")

# Graph traversal
client.graph("use-clerk", depth=2)

# Time-travel (Pro+)
client.graph("use-clerk", depth=2, at="2026-02-01T00:00:00Z")
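When storing many cards programmatically, it can help to assemble the keyword arguments for `store` in one place. A sketch of such a helper, built only from the fields shown in the example above (the helper name itself is an assumption, not package API):

```python
def decision_kwargs(keywords=(), links=()):
    """Keyword arguments for HyperStackClient.store() when recording a decision."""
    return {
        "card_type": "decision",
        "keywords": list(keywords),
        "links": [dict(link) for link in links],
    }

# client.store("use-clerk", "Use Clerk for Auth", "Chose Clerk over Auth0",
#              **decision_kwargs(["clerk", "auth"],
#                                [{"target": "alice", "relation": "decided"}]))
```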

Tools Provided

Tool                Description
hyperstack_search   Search memory for relevant context
hyperstack_store    Save a fact, decision, preference, or person
hyperstack_graph    Traverse the knowledge graph (impact analysis, decision trails)
hyperstack_list     List all stored memories
hyperstack_delete   Remove outdated memories

Why HyperStack?

  • You control the graph. No LLM auto-extraction. No phantom relationships. Your agent explicitly defines cards and links.
  • Zero LLM cost per memory op. Mem0/Zep charge ~$0.002 per operation. HyperStack: $0.
  • Time-travel debugging. See the graph as it existed at any point in time. "Git blame for agent memory."
  • 30-second setup. No Neo4j, no Docker, no OpenSearch. One API key, done.

Pricing

Plan      Cards  Graph             Price
Free      10     —                 $0
Pro       100    ✅ + time-travel  $29/mo
Team      500    ✅ + time-travel  $59/mo
Business  2,000  ✅ + time-travel  $149/mo

Get a free API key at cascadeai.dev/hyperstack

License

MIT
