
An integration package connecting Postgres and LangChain


langchain-postgres


The langchain-postgres package provides implementations of core LangChain abstractions using Postgres.

The package is released under the MIT license.

Feel free to use the abstractions as provided, or modify and extend them as appropriate for your own application.

Requirements

The package supports the asyncpg and psycopg3 drivers.
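Each driver is addressed through a SQLAlchemy-style connection URL. As a rough sketch (assuming the URL format accepted by `PGEngine.from_connection_string`; the host, port, and credentials below are placeholders):

```python
# Placeholder connection details -- replace with your own.
USER = "langchain"
PASSWORD = "langchain"
HOST = "localhost"
PORT = 6024
DB = "langchain"

# SQLAlchemy-style URL for the psycopg (v3) driver:
PSYCOPG_URL = f"postgresql+psycopg://{USER}:{PASSWORD}@{HOST}:{PORT}/{DB}"

# SQLAlchemy-style URL for the asyncpg driver (used with the async APIs):
ASYNCPG_URL = f"postgresql+asyncpg://{USER}:{PASSWORD}@{HOST}:{PORT}/{DB}"

print(PSYCOPG_URL)
print(ASYNCPG_URL)
```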

Installation

pip install -U langchain-postgres

Usage

Vectorstore

[!WARNING] As of v0.0.14, PGVector is deprecated. Please migrate to PGVectorStore for improved performance and manageability. Version 0.0.14 has not been officially released yet; you can test the new vectorstore on the main branch, but do not use it in production until the official release. See the migration guide for details on how to migrate from PGVector to PGVectorStore.

For a detailed end-to-end PGVectorStore example, see the package documentation.

from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

# Replace the connection string with your own Postgres connection string
CONNECTION_STRING = "postgresql+psycopg://langchain:langchain@localhost:6024/langchain"
engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

# Replace the vector size with your own vector size
VECTOR_SIZE = 768
embedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)

TABLE_NAME = "my_doc_collection"

engine.init_vectorstore_table(
    table_name=TABLE_NAME,
    vector_size=VECTOR_SIZE,
)

store = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
)

docs = [
    Document(page_content="Apples and oranges"),
    Document(page_content="Cars and airplanes"),
    Document(page_content="Train")
]

store.add_documents(docs)

query = "I'd like a fruit."
docs = store.similarity_search(query)
print(docs)

[!TIP] Every synchronous function has a corresponding asynchronous counterpart.

ChatMessageHistory

The chat message history abstraction helps to persist chat message history in a Postgres table.

PostgresChatMessageHistory is parameterized using a table_name and a session_id.

The table_name is the name of the table in the database where the chat messages will be stored.

The session_id is a unique identifier for the chat session. It can be assigned by the caller using uuid.uuid4().

import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ... # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)

Google Cloud Integrations

Google Cloud provides Vector Store, Chat Message History, and Data Loader integrations for AlloyDB and Cloud SQL for PostgreSQL databases via dedicated PyPI packages.

Using the Google Cloud integrations provides the following benefits:

  • Enhanced Security: Securely connect to Google Cloud databases utilizing IAM for authorization and database authentication without needing to manage SSL certificates, configure firewall rules, or enable authorized networks.
  • Simplified and Secure Connections: Connect to Google Cloud databases effortlessly using the instance name instead of complex connection strings. The integrations create a secure connection pool that can be easily shared across your application using the engine object.
| Vector Store | Metadata filtering | Async support | Schema Flexibility | Improved metadata handling | Hybrid Search |
|---|---|---|---|---|---|
| Google AlloyDB | ✓ | ✓ | ✓ | ✓ | ✓ |
| Google Cloud SQL Postgres | ✓ | ✓ | ✓ | ✓ | ✓ |
