LangChain integrations for Google Cloud Memorystore

Project description

Quick Start

To use this library, first complete the following steps:

  1. Select or create a Cloud Platform project.

  2. Enable billing for your project.

  3. Enable the Google Memorystore for Redis API.

  4. Set up Authentication.
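The steps above can be sketched with the gcloud CLI. The project ID below is a placeholder, and linking a billing account is typically done in the Cloud Console:

```shell
# Select or create a Cloud Platform project (placeholder project ID).
gcloud projects create my-project-id
gcloud config set project my-project-id

# Enable the Google Memorystore for Redis API.
gcloud services enable redis.googleapis.com

# Set up Application Default Credentials for local development.
gcloud auth application-default login
```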

Installation

Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.

With virtualenv, it’s possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.

Supported Python Versions

Python >= 3.8

Mac/Linux

pip install virtualenv
virtualenv <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install langchain-google-memorystore-redis

Windows

pip install virtualenv
virtualenv <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install langchain-google-memorystore-redis

Vector Store Usage

Use a vector store to store embedded data and perform vector search.

import redis
from langchain_google_memorystore_redis import RedisVectorStore
from langchain_google_vertexai import VertexAIEmbeddings

redis_client = redis.from_url("redis://127.0.0.1:6379")

embeddings_service = VertexAIEmbeddings(model_name="textembedding-gecko@003")
vectorstore = RedisVectorStore(
    client=redis_client,
    index_name="my_vector_index",
    embeddings=embeddings_service,
)

You can also use a clustered client:

import redis
from langchain_google_memorystore_redis import RedisVectorStore
from langchain_google_vertexai import VertexAIEmbeddings

redis_client = redis.cluster.RedisCluster.from_url("redis://127.0.0.1:6379")

embeddings_service = VertexAIEmbeddings(model_name="textembedding-gecko@003")
vectorstore = RedisVectorStore(
    client=redis_client,
    index_name="my_vector_index",
    embeddings=embeddings_service,
)
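Under the hood, vector search ranks stored embeddings by their similarity to the query embedding. A minimal pure-Python sketch of cosine similarity, a common default distance metric (illustrative only, not this library's implementation):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 0.0], [2.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```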

See the full Vector Store tutorial.

Document Loader Usage

Use a document loader to load data as LangChain Documents.

import redis
from langchain_google_memorystore_redis import MemorystoreDocumentLoader

redis_client = redis.from_url("redis://127.0.0.1:6379")

loader = MemorystoreDocumentLoader(
    client=redis_client,
    key_prefix="docs:",
    content_fields={"page_content"},
)
docs = loader.lazy_load()  # returns an iterator of Documents
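Conceptually, the loader scans keys under the given prefix and turns each hash's content fields into a Document's content, with remaining fields as metadata. A pure-Python sketch of that mapping over hypothetical in-memory data (not the library's implementation):

```python
# Hypothetical in-memory stand-in for Redis hashes keyed by "docs:<id>".
store = {
    "docs:1": {"page_content": "hello world", "author": "a"},
    "docs:2": {"page_content": "goodbye", "author": "b"},
    "other:1": {"page_content": "ignored"},  # outside the prefix
}

def load_documents(store, key_prefix, content_fields):
    """Yield (content, metadata) pairs for keys under the prefix."""
    for key, fields in store.items():
        if not key.startswith(key_prefix):
            continue
        content = " ".join(fields[f] for f in content_fields if f in fields)
        metadata = {k: v for k, v in fields.items() if k not in content_fields}
        yield content, metadata

docs = list(load_documents(store, "docs:", {"page_content"}))
```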

See the full Document Loader tutorial.

Chat Message History Usage

Use ChatMessageHistory to store messages and provide conversation history to LLMs.

import redis
from langchain_google_memorystore_redis import MemorystoreChatMessageHistory

redis_client = redis.from_url("redis://127.0.0.1:6379")

history = MemorystoreChatMessageHistory(
    client=redis_client,
    session_id="my-session",
)
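Conceptually, a chat message history is an append-only, per-session list of (role, text) messages that can be replayed to an LLM. A minimal in-memory sketch of that idea (illustrative only, not this library's implementation):

```python
from dataclasses import dataclass, field

@dataclass
class InMemoryChatHistory:
    """Minimal stand-in for a per-session chat message store."""
    session_id: str
    messages: list = field(default_factory=list)

    def add_user_message(self, text):
        self.messages.append(("human", text))

    def add_ai_message(self, text):
        self.messages.append(("ai", text))

history = InMemoryChatHistory(session_id="my-session")
history.add_user_message("Hi!")
history.add_ai_message("Hello, how can I help?")
```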

See the full Chat Message History tutorial.

Contributions

Contributions to this library are always welcome and highly encouraged.

See CONTRIBUTING for more information on how to get started.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Code of Conduct for more information.

License

Apache 2.0 - See LICENSE for more information.

Disclaimer

This is not an officially supported Google product.


Source Distributions

No source distribution files available for this release.

Built Distribution

File details

Details for the file langchain_google_memorystore_redis-0.3.0-py3-none-any.whl.

File hashes

Hashes for langchain_google_memorystore_redis-0.3.0-py3-none-any.whl:

SHA256: fecafb99d5bd23b03da042bdfd75447b057c461aea858fc735345892cf41566c
MD5: 2c016a45ed5f6c05e7994c0ca10fae53
BLAKE2b-256: 0a63f3d63dab2abe8247334c3da74395dad248a60ab17d905926995007caea97
