
Hindsight: Agent Memory That Works Like Human Memory

Reason this release was yanked: update to 0.5.6

Project description

Hindsight API

Memory System for AI Agents — Temporal + Semantic + Entity Memory Architecture using PostgreSQL with pgvector.

Hindsight gives AI agents persistent memory that works like human memory: it stores facts, tracks entities and relationships, handles temporal reasoning ("what happened last spring?"), and forms opinions based on configurable disposition traits.

Installation

pip install hindsight-api

Quick Start

Run the Server

# Set your LLM provider
export HINDSIGHT_API_LLM_PROVIDER=openai
export HINDSIGHT_API_LLM_API_KEY=sk-xxxxxxxxxxxx

# Start the server (uses embedded PostgreSQL by default)
hindsight-api

The server starts at http://localhost:8888 with:

  • REST API for memory operations
  • MCP server at /mcp for tool-use integration

Use the Python API

import asyncio

from hindsight_api import MemoryEngine

async def main():
    # Create and initialize the memory engine
    memory = MemoryEngine()
    await memory.initialize()

    # Create a memory bank for your agent
    bank = await memory.create_memory_bank(
        name="my-assistant",
        background="A helpful coding assistant"
    )

    # Store a memory
    await memory.retain(
        memory_bank_id=bank.id,
        content="The user prefers Python for data science projects"
    )

    # Recall memories
    results = await memory.recall(
        memory_bank_id=bank.id,
        query="What programming language does the user prefer?"
    )

    # Reflect with reasoning
    response = await memory.reflect(
        memory_bank_id=bank.id,
        query="Should I recommend Python or R for this ML project?"
    )

asyncio.run(main())

CLI Options

hindsight-api --help

# Common options
hindsight-api --port 9000          # Custom port (default: 8888)
hindsight-api --host 127.0.0.1     # Bind to localhost only
hindsight-api --workers 4          # Multiple worker processes
hindsight-api --log-level debug    # Verbose logging

Configuration

Configure via environment variables:

Variable                     Description                                           Default
HINDSIGHT_API_DATABASE_URL   PostgreSQL connection string                          pg0 (embedded)
HINDSIGHT_API_LLM_PROVIDER   openai, anthropic, gemini, groq, ollama, or lmstudio  openai
HINDSIGHT_API_LLM_API_KEY    API key for the LLM provider                          -
HINDSIGHT_API_LLM_MODEL      Model name                                            gpt-4o-mini
HINDSIGHT_API_HOST           Server bind address                                   0.0.0.0
HINDSIGHT_API_PORT           Server port                                           8888

Example with External PostgreSQL

export HINDSIGHT_API_DATABASE_URL=postgresql://user:pass@localhost:5432/hindsight
export HINDSIGHT_API_LLM_PROVIDER=groq
export HINDSIGHT_API_LLM_API_KEY=gsk_xxxxxxxxxxxx

hindsight-api

Docker

docker run --rm -it -p 8888:8888 \
  -e HINDSIGHT_API_LLM_API_KEY=$OPENAI_API_KEY \
  -v $HOME/.hindsight-docker:/home/hindsight/.pg0 \
  ghcr.io/vectorize-io/hindsight:latest
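The same container can also be run under Docker Compose. This is a minimal sketch using the image, port, environment variable, and volume mount from the command above; the service name and the `OPENAI_API_KEY` passthrough are illustrative:

```yaml
services:
  hindsight:
    image: ghcr.io/vectorize-io/hindsight:latest
    ports:
      - "8888:8888"
    environment:
      # Passed through from the host shell
      HINDSIGHT_API_LLM_API_KEY: ${OPENAI_API_KEY}
    volumes:
      # Persists the embedded PostgreSQL data between runs
      - ~/.hindsight-docker:/home/hindsight/.pg0
```

Start it with `docker compose up`; the embedded database state survives container restarts because it lives in the mounted host directory.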

MCP Server

For local MCP integration without running the full API server:

hindsight-local-mcp

This runs a stdio-based MCP server that can be used directly with MCP-compatible clients.
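Stdio MCP servers are typically registered in a client-side JSON config. The exact shape varies by client; the sketch below uses the common `mcpServers` convention and assumes only the `hindsight-local-mcp` command from above (the env values are placeholders):

```json
{
  "mcpServers": {
    "hindsight": {
      "command": "hindsight-local-mcp",
      "env": {
        "HINDSIGHT_API_LLM_PROVIDER": "openai",
        "HINDSIGHT_API_LLM_API_KEY": "sk-..."
      }
    }
  }
}
```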

Key Features

  • Multi-Strategy Retrieval (TEMPR) — Semantic, keyword, graph, and temporal search combined with RRF fusion
  • Entity Graph — Automatic entity extraction and relationship tracking
  • Temporal Reasoning — Native support for time-based queries
  • Disposition Traits — Configurable skepticism, literalism, and empathy traits that shape how opinions are formed
  • Three Memory Types — World facts, bank actions, and formed opinions with confidence scores
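The fusion step of multi-strategy retrieval can be illustrated with a generic reciprocal rank fusion (RRF) sketch. This is not Hindsight's internal code, just the standard formula — score(d) = Σ 1/(k + rank_i(d)) over the ranked lists produced by each strategy:

```python
from collections import defaultdict

def rrf_fuse(ranked_lists, k=60):
    """Fuse several best-first ranked lists with reciprocal rank fusion.

    A document's fused score is the sum over lists of 1 / (k + rank),
    with rank starting at 1, so items ranked well by many strategies rise.
    """
    scores = defaultdict(float)
    for ranking in ranked_lists:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

# Example: three strategies disagree on order, but "m2" ranks
# highly in all of them, so it comes out on top after fusion.
semantic = ["m2", "m1", "m3"]
keyword  = ["m1", "m2", "m4"]
temporal = ["m2", "m3", "m1"]
print(rrf_fuse([semantic, keyword, temporal]))  # "m2" first
```

The constant `k` (60 is the value from the original RRF paper) damps the influence of top ranks so that no single strategy can dominate the fused ordering.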

Documentation

Full documentation: https://hindsight.vectorize.io

License

Apache 2.0

Download files

Download the file for your platform.

Source Distribution

hindsight_api_slim-0.5.5.tar.gz (535.3 kB)

Uploaded Source

Built Distribution

hindsight_api_slim-0.5.5-py3-none-any.whl (645.0 kB)

Uploaded Python 3

File details

Details for the file hindsight_api_slim-0.5.5.tar.gz.

File metadata

  • Download URL: hindsight_api_slim-0.5.5.tar.gz
  • Size: 535.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for hindsight_api_slim-0.5.5.tar.gz
Algorithm    Hash digest
SHA256       7d540101f5d1c7ad46f11553ba528fe59449bcf6f5042cb3a56c115e520ded7e
MD5          ce29f5e046ba4307b658311ca8f1692c
BLAKE2b-256  f152093a8c164df844691d1b70d35b7bff1f3df5c111ad1166056cf4b17ab7e7

Provenance

The following attestation bundles were made for hindsight_api_slim-0.5.5.tar.gz:

Publisher: release.yml on vectorize-io/hindsight

File details

Details for the file hindsight_api_slim-0.5.5-py3-none-any.whl.

File hashes

Hashes for hindsight_api_slim-0.5.5-py3-none-any.whl
Algorithm    Hash digest
SHA256       d366b05f0d63a9ce898a82c14ca8e72a6a484a6cf9e3014e0deca531c5dacf76
MD5          dbaf9acac5c068b5edbc89b39ebb0b82
BLAKE2b-256  52b9772c6f9ebe8ef5addd889ddc886e064a9ca15272e109a733f9e3bfe90985

Provenance

The following attestation bundles were made for hindsight_api_slim-0.5.5-py3-none-any.whl:

Publisher: release.yml on vectorize-io/hindsight
