Hindsight: Agent Memory That Works Like Human Memory

Hindsight API

A memory system for AI agents: temporal, semantic, and entity memory built on PostgreSQL with pgvector.

Hindsight gives AI agents persistent memory that works like human memory: it stores facts, tracks entities and relationships, handles temporal reasoning ("what happened last spring?"), and forms opinions based on configurable disposition traits.

Installation

pip install hindsight-api

Quick Start

Run the Server

# Set your LLM provider
export HINDSIGHT_API_LLM_PROVIDER=openai
export HINDSIGHT_API_LLM_API_KEY=sk-xxxxxxxxxxxx

# Start the server (uses embedded PostgreSQL by default)
hindsight-api

The server starts at http://localhost:8888 with:

  • REST API for memory operations
  • MCP server at /mcp for tool-use integration

Use the Python API

import asyncio

from hindsight_api import MemoryEngine

async def main():
    # Create and initialize the memory engine
    memory = MemoryEngine()
    await memory.initialize()

    # Create a memory bank for your agent
    bank = await memory.create_memory_bank(
        name="my-assistant",
        background="A helpful coding assistant"
    )

    # Store a memory
    await memory.retain(
        memory_bank_id=bank.id,
        content="The user prefers Python for data science projects"
    )

    # Recall memories
    results = await memory.recall(
        memory_bank_id=bank.id,
        query="What programming language does the user prefer?"
    )

    # Reflect with reasoning
    response = await memory.reflect(
        memory_bank_id=bank.id,
        query="Should I recommend Python or R for this ML project?"
    )

asyncio.run(main())

CLI Options

hindsight-api --help

# Common options
hindsight-api --port 9000          # Custom port (default: 8888)
hindsight-api --host 127.0.0.1     # Bind to localhost only
hindsight-api --workers 4          # Multiple worker processes
hindsight-api --log-level debug    # Verbose logging

Configuration

Configure via environment variables:

Variable                     Description                                                 Default
HINDSIGHT_API_DATABASE_URL   PostgreSQL connection string                                pg0 (embedded)
HINDSIGHT_API_LLM_PROVIDER   openai, anthropic, gemini, groq, ollama, or lmstudio        openai
HINDSIGHT_API_LLM_API_KEY    API key for the LLM provider                                -
HINDSIGHT_API_LLM_MODEL      Model name                                                  gpt-4o-mini
HINDSIGHT_API_HOST           Server bind address                                         0.0.0.0
HINDSIGHT_API_PORT           Server port                                                 8888
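The resolution rule implied by the table (environment variable if set, otherwise the default) can be sketched in plain Python; this is an illustration of the lookup behavior, not Hindsight's actual settings loader:

```python
import os

# Defaults taken from the table above; environment variables override them.
# (Illustrative sketch only -- the real loader may differ.)
DEFAULTS = {
    "HINDSIGHT_API_DATABASE_URL": "pg0",
    "HINDSIGHT_API_LLM_PROVIDER": "openai",
    "HINDSIGHT_API_LLM_MODEL": "gpt-4o-mini",
    "HINDSIGHT_API_HOST": "0.0.0.0",
    "HINDSIGHT_API_PORT": "8888",
}

def setting(name):
    """Return the environment value for `name`, falling back to the table default."""
    return os.environ.get(name, DEFAULTS.get(name))

host = setting("HINDSIGHT_API_HOST")  # "0.0.0.0" unless overridden in the environment
```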

Example with External PostgreSQL

export HINDSIGHT_API_DATABASE_URL=postgresql://user:pass@localhost:5432/hindsight
export HINDSIGHT_API_LLM_PROVIDER=groq
export HINDSIGHT_API_LLM_API_KEY=gsk_xxxxxxxxxxxx

hindsight-api

Docker

docker run --rm -it -p 8888:8888 \
  -e HINDSIGHT_API_LLM_API_KEY=$OPENAI_API_KEY \
  -v $HOME/.hindsight-docker:/home/hindsight/.pg0 \
  ghcr.io/vectorize-io/hindsight:latest

MCP Server

For local MCP integration without running the full API server:

hindsight-local-mcp

This runs a stdio-based MCP server that can be used directly with MCP-compatible clients.
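Many stdio MCP clients (Claude Desktop, for example) use an `mcpServers` configuration shape, which could launch the server like this; the server name and the passing of LLM environment variables here are assumptions carried over from the server configuration above:

```json
{
  "mcpServers": {
    "hindsight": {
      "command": "hindsight-local-mcp",
      "env": {
        "HINDSIGHT_API_LLM_PROVIDER": "openai",
        "HINDSIGHT_API_LLM_API_KEY": "sk-..."
      }
    }
  }
}
```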

Key Features

  • Multi-Strategy Retrieval (TEMPR) — Semantic, keyword, graph, and temporal search combined with RRF fusion
  • Entity Graph — Automatic entity extraction and relationship tracking
  • Temporal Reasoning — Native support for time-based queries
  • Disposition Traits — Configurable skepticism, literalism, and empathy traits that shape how opinions are formed
  • Three Memory Types — World facts, bank actions, and formed opinions with confidence scores
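The RRF fusion mentioned in the first bullet is Reciprocal Rank Fusion, a standard technique for merging ranked lists from different retrieval strategies. A minimal sketch (not Hindsight's implementation; the function name, constant, and data are illustrative):

```python
def rrf_fuse(rankings, k=60):
    """Merge multiple ranked lists of ids with Reciprocal Rank Fusion.

    rankings: list of ranked id lists, best first.
    k: smoothing constant; k=60 is the value from the original RRF paper.
    Each id scores sum(1 / (k + rank)) over every list it appears in,
    so items ranked highly by several strategies rise to the top.
    """
    scores = {}
    for ranked in rankings:
        for rank, item in enumerate(ranked, start=1):
            scores[item] = scores.get(item, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Three strategies disagree on order; m1 is never first but is
# consistently near the top, so fusion ranks it highest.
semantic = ["m3", "m1", "m2"]
keyword  = ["m1", "m3", "m4"]
temporal = ["m2", "m1"]
fused = rrf_fuse([semantic, keyword, temporal])  # ["m1", "m3", "m2", "m4"]
```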

Documentation

Full documentation: https://hindsight.vectorize.io

License

Apache 2.0
