# hippocortex

Official Python SDK for Hippocortex, AI agent memory that learns from experience.
## Install

```bash
pip install hippocortex

# With adapter support (quote the extras so your shell doesn't glob the brackets):
pip install "hippocortex[openai-agents]"  # OpenAI Agents SDK
pip install "hippocortex[langgraph]"      # LangGraph
pip install "hippocortex[crewai]"         # CrewAI
pip install "hippocortex[autogen]"        # AutoGen
pip install "hippocortex[all]"            # All adapters
```
## Quick Start

Choose the integration method that fits your workflow.

### Auto-Instrumentation (Easiest, 1 Line)

```python
import hippocortex.auto

from openai import OpenAI

client = OpenAI()

# Every call now has persistent memory automatically:
# - Past context is synthesized and injected
# - The conversation is captured for future learning
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Deploy payments to staging"}],
)
```
### `wrap()` (Recommended)

```python
from hippocortex import wrap
from openai import OpenAI

# Wrap your client. Explicit, per-client control.
client = wrap(OpenAI())

# Use exactly as before. Memory is transparent.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Deploy payments to staging"}],
)

# Works with Anthropic too:
from anthropic import Anthropic
client = wrap(Anthropic())
```
### Manual Client (Advanced)

```python
import asyncio

from hippocortex import Hippocortex
from hippocortex.types import CaptureEvent

async def main():
    hx = Hippocortex(api_key="hx_live_...")

    # 1. Capture agent events
    await hx.capture(CaptureEvent(
        type="message",
        session_id="sess-1",
        payload={"role": "user", "content": "Deploy payment service to staging"},
    ))

    # 2. Learn from experience
    await hx.learn()

    # 3. Synthesize context for decisions
    ctx = await hx.synthesize("deploy payment service")
    for entry in ctx.entries:
        print(f"[{entry.section}] {entry.content}")

asyncio.run(main())
```
### Synchronous Client

```python
from hippocortex import SyncHippocortex
from hippocortex.types import CaptureEvent

hx = SyncHippocortex(api_key="hx_live_...")

result = hx.capture(CaptureEvent(
    type="message",
    session_id="sess-1",
    payload={"role": "user", "content": "Hello"},
))
print(result.event_id)
```
## Zero-Config

Both auto-instrumentation and `wrap()` resolve configuration automatically, in order of precedence:

1. Explicit arguments passed to `wrap()` or `Hippocortex()`
2. Environment variables: `HIPPOCORTEX_API_KEY`, `HIPPOCORTEX_BASE_URL`
3. A `.hippocortex.json` file (searched from the current working directory upward)

### .hippocortex.json

```json
{
  "apiKey": "hx_live_your_key_here",
  "baseUrl": "https://api.hippocortex.dev/v1"
}
```
## Auto-Memory Adapters

Wrap your agents with one line to get automatic memory capture and context injection.

### OpenAI Agents SDK

```python
from agents import Agent, Runner
from hippocortex import auto_memory

agent = Agent(name="assistant", instructions="You are helpful.")
agent = auto_memory(agent, api_key="hx_live_...")

# Or with env var HIPPOCORTEX_API_KEY:
agent = auto_memory(agent)

result = await Runner.run(agent, "Deploy to staging")
```

The adapter automatically:

- Synthesizes past context before each run (injected into instructions)
- Captures user messages, assistant responses, and tool calls
- Gracefully degrades if the server is unreachable
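The synthesize, inject, capture cycle above can be sketched generically. Everything below (`FakeMemory`, `run_with_memory`, the echo "agent") is an illustrative stand-in, not the SDK's actual API:

```python
class FakeMemory:
    """In-memory stand-in for a Hippocortex client."""
    def __init__(self):
        self.captured = []

    def synthesize(self, query: str) -> str:
        return f"# Memory Context\nPrior notes relevant to: {query}"

    def capture(self, role: str, content: str) -> None:
        self.captured.append({"role": role, "content": content})

def run_with_memory(memory, instructions: str, user_message: str) -> str:
    # 1. Synthesize past context and inject it into the instructions.
    try:
        context = memory.synthesize(user_message)
    except Exception:
        context = ""  # graceful degradation: empty context, keep going
    effective = f"{instructions}\n\n{context}" if context else instructions
    # 2. Run the underlying agent (stubbed here as a fixed reply).
    response = f"[agent saw {len(effective)} chars of instructions] ok"
    # 3. Capture both sides of the exchange for future learning.
    try:
        memory.capture("user", user_message)
        memory.capture("assistant", response)
    except Exception:
        pass  # capture must never crash the agent
    return response
```

The real adapters do the same dance around `Runner.run`, `ainvoke`, `kickoff`, and friends, with the try/except blocks providing the "never crashes because of memory" guarantee.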
### LangGraph

```python
from hippocortex.adapters import langgraph as hx_langgraph

# Wrap a compiled LangGraph
graph = builder.compile()
graph = hx_langgraph.wrap(graph, api_key="hx_live_...")

# Use normally; memory is automatic
result = await graph.ainvoke({"messages": [{"role": "user", "content": "deploy"}]})
```

Supports `ainvoke`, `invoke`, and `astream`.
### CrewAI (Beta)

```python
from hippocortex.adapters import crewai as hx_crewai

crew = hx_crewai.wrap(crew, api_key="hx_live_...")
result = crew.kickoff()
```

Injects memory context into agent backstories and captures task results.
### AutoGen (Beta)

```python
from hippocortex.adapters import autogen as hx_autogen

agent = hx_autogen.wrap(agent, api_key="hx_live_...")
agent.initiate_chat(other_agent, message="Hello")
```

Captures messages and injects synthesized context via reply hooks.
### OpenClaw

```python
from hippocortex.adapters import openclaw as hx_openclaw

middleware = hx_openclaw.create_middleware(api_key="hx_live_...")

# On incoming message:
context = await middleware.on_message("Deploy the service")
# context = "# Hippocortex Memory Context\n..."

# After generating response:
await middleware.on_response("Deployment complete!")
```
## Configuration

All adapters support:

| Option | Env Var | Default | Description |
|---|---|---|---|
| `api_key` | `HIPPOCORTEX_API_KEY` | (required) | API key |
| `base_url` | `HIPPOCORTEX_BASE_URL` | `https://api.hippocortex.dev/v1` | API URL |
| `session_id` | — | auto-generated | Session ID |
## Key Behaviors

- **Fire-and-forget capture**: capture calls never block the agent
- **Error swallowing**: all Hippocortex errors are logged and swallowed; your agent never crashes because of memory
- **Graceful degradation**: if the server is unreachable, adapters return empty context and continue
- **Session tracking**: auto-generated session IDs link related events
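The fire-and-forget pattern can be sketched with plain `asyncio`: schedule the capture as a background task, and log and swallow any error so the agent's own coroutine is never blocked or crashed by memory I/O. `send_capture` here is a hypothetical stand-in for the real network call, not an SDK function:

```python
import asyncio
import logging

log = logging.getLogger("memory")

async def send_capture(event: dict) -> None:
    """Stand-in for the HTTP call that ships an event to the server."""
    await asyncio.sleep(0)
    if event.get("boom"):
        raise RuntimeError("server unreachable")

def capture_in_background(event: dict) -> asyncio.Task:
    """Schedule a capture without awaiting it; errors are logged, never raised."""
    async def _run():
        try:
            await send_capture(event)
        except Exception:
            log.warning("capture failed; continuing without memory", exc_info=True)
    return asyncio.get_running_loop().create_task(_run())

async def main():
    ok = capture_in_background({"type": "message"})
    bad = capture_in_background({"type": "message", "boom": True})
    await asyncio.gather(ok, bad)  # neither failure propagates to the caller
    print("agent still running")

asyncio.run(main())
```

Because the exception is caught inside the task, even a dead memory server leaves the agent loop untouched.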
## API Reference

### Hippocortex(api_key, base_url?, timeout?)

Async client. Use with `await`.

- `capture(event)` → `CaptureResult`
- `capture_batch(events)` → `BatchCaptureResult`
- `learn(options?)` → `LearnResult`
- `synthesize(query, options?)` → `SynthesizeResult`
- `list_artifacts(...)` → `ArtifactListResult`
- `get_artifact(id)` → `Artifact`
- `get_metrics(...)` → `MetricsResult`

### SyncHippocortex(api_key, base_url?, timeout?)

Synchronous client. Same methods, no `await` needed.
## License

MIT
## File details

### hippocortex-1.2.1.tar.gz

- Size: 37.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ebd45bab3001bdbbbc96fc273b69df939549be10aed191e96081bd9de76f756d` |
| MD5 | `9309aca471550b92954d50502d01961a` |
| BLAKE2b-256 | `f67c1127b47c8011b89814f59537f6f256df27c5cd799f5e2217f1a7fc3d32cd` |
### hippocortex-1.2.1-py3-none-any.whl

- Size: 34.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

| Algorithm | Hash digest |
|---|---|
| SHA256 | `259cbf3faa9b5ec10c7c9e54a1cd905aed304cce67f49181cf62086e74852039` |
| MD5 | `b4528e4e094c96e51385abcea3620ae0` |
| BLAKE2b-256 | `67e28ca9e53674c162a98195f722e9ae9170cd58d8a2eee15553d535def17771` |