
hippocortex

Official Python SDK for Hippocortex, AI agent memory that learns from experience.

Install

pip install hippocortex

# With adapter support:
pip install "hippocortex[openai-agents]"  # OpenAI Agents SDK
pip install "hippocortex[langgraph]"      # LangGraph
pip install "hippocortex[crewai]"         # CrewAI
pip install "hippocortex[autogen]"        # AutoGen
pip install "hippocortex[all]"            # All adapters
# (Quotes keep shells like zsh from treating the brackets as glob patterns.)

Quick Start

Choose the integration method that fits your workflow:

Auto-Instrumentation (Easiest, 1 Line)

import hippocortex.auto
from openai import OpenAI

client = OpenAI()

# Every call now has persistent memory automatically:
# - Past context is synthesized and injected
# - The conversation is captured for future learning
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Deploy payments to staging"}]
)

wrap() (Recommended)

from hippocortex import wrap
from openai import OpenAI

# Wrap your client. Explicit, per-client control.
client = wrap(OpenAI())

# Use exactly as before. Memory is transparent.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Deploy payments to staging"}]
)

# Works with Anthropic too:
from anthropic import Anthropic
client = wrap(Anthropic())

Manual Client (Advanced)

import asyncio
from hippocortex import Hippocortex
from hippocortex.types import CaptureEvent

async def main():
    hx = Hippocortex(api_key="hx_live_...")

    # 1. Capture agent events
    await hx.capture(CaptureEvent(
        type="message",
        session_id="sess-1",
        payload={"role": "user", "content": "Deploy payment service to staging"}
    ))

    # 2. Learn from experience
    await hx.learn()

    # 3. Synthesize context for decisions
    ctx = await hx.synthesize("deploy payment service")
    for entry in ctx.entries:
        print(f"[{entry.section}] {entry.content}")

asyncio.run(main())

Synchronous Client

from hippocortex import SyncHippocortex
from hippocortex.types import CaptureEvent

hx = SyncHippocortex(api_key="hx_live_...")

result = hx.capture(CaptureEvent(
    type="message",
    session_id="sess-1",
    payload={"role": "user", "content": "Hello"}
))
print(result.event_id)

Zero-Config

Both auto-instrumentation and wrap() resolve configuration automatically, in priority order:

  1. Explicit arguments passed to wrap() or Hippocortex()
  2. Environment variables: HIPPOCORTEX_API_KEY, HIPPOCORTEX_BASE_URL
  3. .hippocortex.json file (searched from cwd upward)

.hippocortex.json

{
  "apiKey": "hx_live_your_key_here",
  "baseUrl": "https://api.hippocortex.dev/v1"
}

Auto-Memory Adapters

Wrap your agents with one line to get automatic memory capture and context injection.

OpenAI Agents SDK

from agents import Agent, Runner
from hippocortex import auto_memory

agent = Agent(name="assistant", instructions="You are helpful.")
agent = auto_memory(agent, api_key="hx_live_...")

# Or with env var HIPPOCORTEX_API_KEY:
agent = auto_memory(agent)

# Runner.run is a coroutine; call it from an async function
# (or use Runner.run_sync outside one).
result = await Runner.run(agent, "Deploy to staging")

The adapter automatically:

  • Synthesizes past context before each run (injected into instructions)
  • Captures user messages, assistant responses, and tool calls
  • Gracefully degrades if the server is unreachable
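The injection step can be pictured as a simple prepend. This is a hypothetical illustration of the behavior, not the adapter's actual code, and `inject_context` is an invented name:

```python
def inject_context(instructions: str, memory_context: str) -> str:
    """Prepend synthesized memory context to an agent's instructions.
    If synthesis returned nothing (e.g. the server was unreachable),
    the instructions pass through unchanged."""
    if not memory_context:
        return instructions
    return f"{memory_context}\n\n{instructions}"
```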

LangGraph

from hippocortex.adapters import langgraph as hx_langgraph

# Wrap a compiled LangGraph
graph = builder.compile()
graph = hx_langgraph.wrap(graph, api_key="hx_live_...")

# Use normally — memory is automatic
result = await graph.ainvoke({"messages": [{"role": "user", "content": "deploy"}]})

Supports ainvoke, invoke, and astream.

CrewAI (Beta)

from hippocortex.adapters import crewai as hx_crewai

crew = hx_crewai.wrap(crew, api_key="hx_live_...")
result = crew.kickoff()

Injects memory context into agent backstories and captures task results.

AutoGen (Beta)

from hippocortex.adapters import autogen as hx_autogen

agent = hx_autogen.wrap(agent, api_key="hx_live_...")
agent.initiate_chat(other_agent, message="Hello")

Captures messages and injects synthesized context via reply hooks.

OpenClaw

from hippocortex.adapters import openclaw as hx_openclaw

middleware = hx_openclaw.create_middleware(api_key="hx_live_...")

# On incoming message:
context = await middleware.on_message("Deploy the service")
# context = "# Hippocortex Memory Context\n..."

# After generating response:
await middleware.on_response("Deployment complete!")

Configuration

All adapters support:

Option       Env Var                 Default                          Description
api_key      HIPPOCORTEX_API_KEY     (required)                       API key
base_url     HIPPOCORTEX_BASE_URL    https://api.hippocortex.dev/v1   API URL
session_id   (none)                  auto-generated                   Session ID

Key Behaviors

  • Fire-and-forget capture: Capture calls never block the agent
  • Error swallowing: All Hippocortex errors are logged and swallowed — your agent never crashes because of memory
  • Graceful degradation: If the server is unreachable, adapters return empty context and continue
  • Session tracking: Auto-generated session IDs link related events
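The fire-and-forget and error-swallowing behaviors can be sketched with plain asyncio. This is an illustration of the pattern, not the SDK's internals, and `fire_and_forget` is a hypothetical name:

```python
import asyncio
import logging

log = logging.getLogger("hippocortex")

def fire_and_forget(coro) -> asyncio.Task:
    """Schedule a capture coroutine without awaiting it; log and swallow
    any failure so the calling agent never blocks or crashes."""
    task = asyncio.ensure_future(coro)

    def _swallow(t: asyncio.Task) -> None:
        # Retrieving the exception here also silences asyncio's
        # "exception was never retrieved" warning.
        if not t.cancelled() and t.exception() is not None:
            log.warning("capture failed: %s", t.exception())

    task.add_done_callback(_swallow)
    return task
```

The agent's own coroutine continues immediately; a failed capture only produces a log line.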

API Reference

Hippocortex(api_key, base_url?, timeout?)

Async client. Use with await.

  • capture(event) → CaptureResult
  • capture_batch(events) → BatchCaptureResult
  • learn(options?) → LearnResult
  • synthesize(query, options?) → SynthesizeResult
  • list_artifacts(...) → ArtifactListResult
  • get_artifact(id) → Artifact
  • get_metrics(...) → MetricsResult

SyncHippocortex(api_key, base_url?, timeout?)

Synchronous client. Same methods, no await needed.

License

MIT
