An integration package connecting Postgres and LangChain
langchain-postgres
The langchain-postgres package contains implementations of core LangChain abstractions using Postgres.
The package is released under the MIT license.
Feel free to use the abstractions as provided, or to modify and extend them as appropriate for your own application.
Requirements
The package supports the asyncpg and psycopg3 drivers.
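The driver is selected through the SQLAlchemy-style connection URL. A minimal sketch of the two URL forms (host, port, credentials, and database name below are placeholders):

```python
# psycopg (version 3) -- SQLAlchemy registers this dialect as "psycopg"
SYNC_URL = "postgresql+psycopg://user:password@localhost:5432/mydb"

# asyncpg -- for asynchronous usage
ASYNC_URL = "postgresql+asyncpg://user:password@localhost:5432/mydb"
```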
Installation
```bash
pip install -U langchain-postgres
```
Vectorstore
[!WARNING] In v0.0.14+, `PGVector` is deprecated. Please migrate to `PGVectorStore` for improved performance and manageability. See the migration guide for details on how to migrate from `PGVector` to `PGVectorStore`.
Documentation
Example
```python
from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

# Replace the connection string with your own Postgres connection string
CONNECTION_STRING = "postgresql+psycopg3://langchain:langchain@localhost:6024/langchain"
engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

# Replace the vector size with your own vector size
VECTOR_SIZE = 768
embedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)

TABLE_NAME = "my_doc_collection"

engine.init_vectorstore_table(
    table_name=TABLE_NAME,
    vector_size=VECTOR_SIZE,
)

store = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
)

docs = [
    Document(page_content="Apples and oranges"),
    Document(page_content="Cars and airplanes"),
    Document(page_content="Train"),
]

store.add_documents(docs)

query = "I'd like a fruit."
docs = store.similarity_search(query)
print(docs)
```
[!TIP] All synchronous functions have corresponding asynchronous functions
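For example, a sketch of the async path for the vector store above, assuming the async counterparts follow the usual `a`-prefix naming (`ainit_vectorstore_table`, `aadd_documents`, `asimilarity_search`) and the async factory `PGVectorStore.create`:

```python
import asyncio

from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

CONNECTION_STRING = "postgresql+psycopg3://langchain:langchain@localhost:6024/langchain"
VECTOR_SIZE = 768


async def main() -> None:
    engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

    # Assumed async counterpart of init_vectorstore_table
    await engine.ainit_vectorstore_table(
        table_name="my_async_collection",
        vector_size=VECTOR_SIZE,
    )

    # Assumed async counterpart of create_sync
    store = await PGVectorStore.create(
        engine=engine,
        table_name="my_async_collection",
        embedding_service=DeterministicFakeEmbedding(size=VECTOR_SIZE),
    )

    await store.aadd_documents([Document(page_content="Apples and oranges")])
    print(await store.asimilarity_search("I'd like a fruit."))


asyncio.run(main())
```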
Hybrid Search with PGVectorStore
With `PGVectorStore` you can use hybrid search, which combines vector similarity search with keyword search, for more comprehensive and relevant search results.
```python
# HybridSearchConfig and reciprocal_rank_fusion ship with the package;
# the exact import path may vary by version
from langchain_postgres.v2.hybrid_search_config import (
    HybridSearchConfig,
    reciprocal_rank_fusion,
)

vs = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
    hybrid_search_config=HybridSearchConfig(
        fusion_function=reciprocal_rank_fusion
    ),
)

hybrid_docs = vs.similarity_search("products", k=5)
```
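For intuition, reciprocal rank fusion merges the vector-search and keyword-search result lists by rank rather than by raw score. A minimal, generic sketch of the idea (not the package's implementation):

```python
from collections import defaultdict


def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse several ranked lists of document ids: each document scores
    sum(1 / (k + rank)) over the lists it appears in."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)


# Example: fuse a vector-search ranking with a keyword-search ranking
print(rrf([["a", "b", "c"], ["c", "a", "d"]]))
```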
For a detailed guide on how to use hybrid search, see the documentation.
ChatMessageHistory
The chat message history abstraction helps to persist chat message history in a postgres table.
PostgresChatMessageHistory is parameterized using a table_name and a session_id.
The table_name is the name of the table in the database where
the chat messages will be stored.
The session_id is a unique identifier for the chat session. It can be assigned
by the caller using uuid.uuid4().
```python
import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ...  # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)
```
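The asynchronous path mirrors this; a sketch assuming a `psycopg.AsyncConnection`, an `async_connection` keyword argument, and `a`-prefixed counterparts (`acreate_tables`, `aadd_messages`, `aget_messages`):

```python
import asyncio
import uuid

from langchain_core.messages import HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg


async def main() -> None:
    conn_info = ...  # Fill in with your connection info
    async_connection = await psycopg.AsyncConnection.connect(conn_info)

    # Assumed async counterpart of create_tables
    await PostgresChatMessageHistory.acreate_tables(async_connection, "chat_history")

    chat_history = PostgresChatMessageHistory(
        "chat_history",
        str(uuid.uuid4()),
        async_connection=async_connection,
    )

    # Assumed async counterparts of add_messages and the messages property
    await chat_history.aadd_messages([HumanMessage(content="bark")])
    print(await chat_history.aget_messages())


asyncio.run(main())
```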
Google Cloud Integrations
Google Cloud provides Vector Store, Chat Message History, and Data Loader integrations for AlloyDB and Cloud SQL for PostgreSQL databases via dedicated PyPI packages.
Using the Google Cloud integrations provides the following benefits:
- Enhanced Security: Securely connect to Google Cloud databases utilizing IAM for authorization and database authentication without needing to manage SSL certificates, configure firewall rules, or enable authorized networks.
- Simplified and Secure Connections: Connect to Google Cloud databases effortlessly using the instance name instead of complex connection strings. The integrations create a secure connection pool that can be easily shared across your application using the `engine` object.
| Vector Store | Metadata filtering | Async support | Schema Flexibility | Improved metadata handling | Hybrid Search |
|---|---|---|---|---|---|
| Google AlloyDB | ✓ | ✓ | ✓ | ✓ | ✗ |
| Google Cloud SQL Postgres | ✓ | ✓ | ✓ | ✓ | ✗ |
Download files
File details
Details for the file langchain_postgres-0.0.16.tar.gz.
File metadata
- Download URL: langchain_postgres-0.0.16.tar.gz
- Upload date:
- Size: 232.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d09aa4ea77ee8600a9ff64de9c185fb558aa388c816c7be04dd4559c878530b7 |
| MD5 | 74f050c1d4e404ea16b82a1e3c12d0cf |
| BLAKE2b-256 | 6f5e00065782aa0ad7b5faa9ff6881bcf361f2a7741e39db8e2b3e86164f80c8 |
File details
Details for the file langchain_postgres-0.0.16-py3-none-any.whl.
File metadata
- Download URL: langchain_postgres-0.0.16-py3-none-any.whl
- Upload date:
- Size: 46.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a7375cf9fc9b6965efc207dbcc959424e96b8ffe75d5ced6055676d2613f8d37 |
| MD5 | 05b7db0e050b4abfc4a4f86a8cdc0a3b |
| BLAKE2b-256 | 5aa2516934f8be231e50bb2afda8112641850154b057f5c45d82f42d216bed3d |