langchain-postgres

An integration package connecting Postgres and LangChain

The langchain-postgres package contains implementations of core LangChain abstractions using Postgres.

The package is released under the MIT license.

Feel free to use the abstractions as provided, or modify and extend them as appropriate for your own application.

Requirements

The package currently only supports the psycopg3 driver.

Installation

pip install -U langchain-postgres

Usage

PostgresSaver (LangGraph Checkpointer)

The LangGraph checkpointer can be used to add memory to your LangGraph application.

PostgresSaver is an implementation of the checkpoint saver using Postgres as the backend.

Currently, only the psycopg3 driver is supported.

Sync usage:

from psycopg_pool import ConnectionPool
from langchain_postgres import (
    PostgresSaver, PickleCheckpointSerializer
)

pool = ConnectionPool(
    # Example configuration
    conninfo="postgresql://langchain:langchain@localhost:6024/langchain",
    max_size=20,
)

# Create the tables in postgres (only needs to be done once)
PostgresSaver.create_tables(pool)

checkpointer = PostgresSaver(
    serializer=PickleCheckpointSerializer(),
    sync_connection=pool,
)

# Set up the langgraph workflow with the checkpointer
workflow = ... # Fill in with your workflow
app = workflow.compile(checkpointer=checkpointer)

# Use with the sync methods of `app` (e.g., `app.stream()`)

pool.close() # Remember to close the connection pool.

Async usage:

from psycopg_pool import AsyncConnectionPool
from langchain_postgres import (
    PostgresSaver, PickleCheckpointSerializer
)

pool = AsyncConnectionPool(
    # Example configuration
    conninfo="postgresql://langchain:langchain@localhost:6024/langchain",
    max_size=20,
)

# Create the tables in postgres (only needs to be done once)
await PostgresSaver.acreate_tables(pool)

checkpointer = PostgresSaver(
    serializer=PickleCheckpointSerializer(),
    async_connection=pool,
)

# Set up the langgraph workflow with the checkpointer
workflow = ... # Fill in with your workflow
app = workflow.compile(checkpointer=checkpointer)

# Use with the async methods of `app` (e.g., `app.astream()`)

await pool.close() # Remember to close the connection pool.

Testing

When testing with the Postgres checkpointer, it is useful to create the tables before the tests and drop them afterwards.

from psycopg_pool import ConnectionPool
from langchain_postgres import PostgresSaver

with ConnectionPool(
    # Example configuration
    conninfo="postgresql://langchain:langchain@localhost:6024/langchain",
    max_size=20,
) as pool:
    # Create the tables before running the tests
    PostgresSaver.create_tables(pool)
    # Run your unit tests with langgraph
    ...
    # Drop the tables after the tests
    PostgresSaver.drop_tables(pool)
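The create-then-drop pattern above can be wrapped in a small helper so every test run starts from a clean schema. A minimal sketch (the helper name `fresh_tables` is ours, not part of the package; the `create` and `drop` callables stand in for `PostgresSaver.create_tables` and `PostgresSaver.drop_tables`):

```python
from contextlib import contextmanager

@contextmanager
def fresh_tables(pool, create, drop):
    """Create the checkpointer tables before a test run and drop them after.

    `create` and `drop` are callables that take the connection pool,
    e.g. PostgresSaver.create_tables and PostgresSaver.drop_tables.
    """
    create(pool)
    try:
        yield pool
    finally:
        # Runs even if a test raises, so the schema never leaks between runs
        drop(pool)
```

Used as `with fresh_tables(pool, PostgresSaver.create_tables, PostgresSaver.drop_tables): ...`, the tables are guaranteed to be dropped even when a test fails.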

ChatMessageHistory

The chat message history abstraction helps persist chat message history in a Postgres table.

PostgresChatMessageHistory is parameterized using a table_name and a session_id.

The table_name is the name of the table in the database where the chat messages will be stored.

The session_id is a unique identifier for the chat session. It can be assigned by the caller using uuid.uuid4().
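For illustration, a session_id generated this way is simply a UUID4 rendered as a fixed-length string:

```python
import uuid

# Generate a unique identifier for a chat session
session_id = str(uuid.uuid4())

# UUID4 strings have the fixed 8-4-4-4-12 hex layout, 36 characters total
print(len(session_id))  # 36
```

Any scheme that yields a unique string per conversation works; UUID4 is just a convenient default.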

import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ... # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)

Vectorstore

For the PGVector vectorstore, see the example in the LangChain documentation.
