
LangChain integrations for Google Cloud Bigtable

Project description


Quick Start

To use this library, first complete the following steps:

  1. Select or create a Cloud Platform project.

  2. Enable billing for your project.

  3. Enable the Google Cloud Bigtable API.

  4. Set up Authentication.
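For local development, steps 3 and 4 can typically be completed from the gcloud CLI; a sketch (assumes the gcloud SDK is installed and you are logged in to the right project):

```shell
# Enable the Bigtable API for the active project (can also be done in the Cloud Console).
gcloud services enable bigtable.googleapis.com

# Create Application Default Credentials, which the client library picks up automatically.
gcloud auth application-default login
```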

Installation

Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.

With virtualenv, it’s possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.

Supported Python Versions

Python >= 3.9

Mac/Linux

pip install virtualenv
virtualenv <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install langchain-google-bigtable

Windows

pip install virtualenv
virtualenv <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install langchain-google-bigtable

Vector Store Usage

Use BigtableVectorStore to store documents and their vector embeddings, allowing you to search for the most similar or relevant documents from your database.

  • Full VectorStore Implementation: Supports all methods from the LangChain VectorStore abstract class.

  • Async/Sync Support: All methods are available in both asynchronous and synchronous versions.

  • Metadata Filtering: Supports filtering on metadata fields, including logical AND/OR combinations and filtering on document IDs with a specific prefix.

  • Multiple Distance Strategies: Supports both Cosine and Euclidean distance for similarity search.

  • Customizable Storage: Full control over how content, embeddings, and metadata are stored in Bigtable columns.

from langchain_google_bigtable import BigtableVectorStore, BigtableEngine

# Your embedding service and other configurations
# embedding_service = ...

engine = await BigtableEngine.async_initialize(project_id="your-project-id")
vector_store = await BigtableVectorStore.create(
    engine=engine,
    instance_id="your-instance-id",
    table_id="your-table-id",
    embedding_service=embedding_service,
    collection="your_collection_name",
)
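With the store created as above, documents can be added and queried through LangChain's standard VectorStore interface (aadd_texts, asimilarity_search). A minimal sketch; the texts, query, and k are placeholders:

```python
# Sketch: adding documents and running a similarity search, assuming the
# vector_store created in the snippet above. aadd_texts / asimilarity_search
# are part of LangChain's standard VectorStore API.
async def add_and_search(vector_store):
    # Store a few texts; embeddings are computed by the configured embedding_service.
    await vector_store.aadd_texts(["document one", "document two"])
    # Return the k most similar Documents to the query.
    return await vector_store.asimilarity_search("your query text", k=2)
```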

See the full Vector Store tutorial.

Key-value Store Usage

Use BigtableByteStore as a key-value store in LangChain.

  • ByteStore Interface: Follows LangChain’s ByteStore for string keys and byte values.

  • Sync/Async: Supports both synchronous and asynchronous operations.

  • BigtableEngine: Manages execution context.

from langchain_google_bigtable import BigtableByteStore, BigtableEngine

engine = await BigtableEngine.async_initialize(project_id="your-project-id")
store = await BigtableByteStore.create(
    engine=engine,
    instance_id="your-instance-id",
    table_id="your-table-id",
)
await store.amset([("key", b"value")])
retrieved = await store.amget(["key"])
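The same operations also have synchronous counterparts from LangChain's ByteStore interface (mset, mget, mdelete). A sketch of a full round trip, assuming a store created with the synchronous initializer:

```python
# Sketch: synchronous round trip against any LangChain ByteStore,
# such as a BigtableByteStore. Keys are strings, values are bytes.
def roundtrip(store):
    store.mset([("key", b"value")])   # write string key -> bytes value
    [value] = store.mget(["key"])     # read back; missing keys yield None
    store.mdelete(["key"])            # remove the key
    return value
```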

See the full Key-value Store tutorial.

Document Loader Usage

Use a document loader to load data as LangChain Documents.

from langchain_google_bigtable import BigtableLoader


loader = BigtableLoader(
    instance_id="my-instance",
    table_id="my-table-name"
)
docs = loader.lazy_load()
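Note that lazy_load() returns an iterator, so rows stream one Document at a time rather than being materialized all at once. A sketch of consuming it incrementally:

```python
# Sketch: streaming Documents from a loader (such as the BigtableLoader above)
# without loading the whole table into memory.
def first_page_contents(loader, limit=5):
    contents = []
    for doc in loader.lazy_load():  # yields LangChain Document objects lazily
        contents.append(doc.page_content)
        if len(contents) >= limit:
            break
    return contents
```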

See the full Document Loader tutorial.

Chat Message History Usage

Use ChatMessageHistory to store messages and provide conversation history to LLMs.

from langchain_google_bigtable import BigtableChatMessageHistory


history = BigtableChatMessageHistory(
    instance_id="my-instance",
    table_id="my-message-store",
    session_id="my-session-id"
)
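BigtableChatMessageHistory follows LangChain's standard BaseChatMessageHistory interface, so messages are recorded and read back with the usual methods. A sketch of storing one conversational turn:

```python
# Sketch: recording a user/assistant exchange, assuming a history object like
# the one created above. add_user_message / add_ai_message / messages are the
# standard BaseChatMessageHistory interface.
def record_turn(history, user_text, ai_text):
    history.add_user_message(user_text)
    history.add_ai_message(ai_text)
    return history.messages  # full conversation, oldest first
```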

See the full Chat Message History tutorial.

Contributions

Contributions to this library are always welcome and highly encouraged.

See CONTRIBUTING for more information on how to get started.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Code of Conduct for more information.

License

Apache 2.0 - See LICENSE for more information.

Disclaimer

This is not an officially supported Google product.
