An integration package connecting Google's genai package and LangChain

Project description

langchain-google-genai

This package contains the LangChain integrations for Google's Gemini models through the Google Generative AI SDK.

Installation

pip install -U langchain-google-genai

Chat Models

This package contains the ChatGoogleGenerativeAI class, which is the recommended way to interface with the Google Gemini series of models.

To use, install the package and configure your environment with your Google API key:

export GOOGLE_API_KEY=your-api-key
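
If you prefer to set the key from Python rather than your shell, here is a minimal sketch using getpass (the prompt text is illustrative):

import getpass
import os

# Prompt for the key only if it is not already present in the environment.
if "GOOGLE_API_KEY" not in os.environ:
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter your Google API key: ")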

Then initialize the model and invoke it:

from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-pro")
llm.invoke("Sing a ballad of LangChain.")
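
ChatGoogleGenerativeAI is a standard LangChain chat model, so it composes with the usual runnable interface. A minimal sketch of piping a prompt template into the model and streaming the reply (the prompt and topic are illustrative):

from langchain_core.prompts import ChatPromptTemplate
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-pro")
prompt = ChatPromptTemplate.from_template("Write a short poem about {topic}.")
chain = prompt | llm

# Stream the response chunk by chunk instead of waiting for the full message.
for chunk in chain.stream({"topic": "LangChain"}):
    print(chunk.content, end="", flush=True)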

Multimodal inputs

Gemini vision models support image inputs when they are provided in a single chat message. Example:

from langchain_core.messages import HumanMessage
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-pro-vision")
# Combine a text part and an image part in a single multimodal message
message = HumanMessage(
    content=[
        {
            "type": "text",
            "text": "What's in this image?",
        },  # You can optionally provide text parts
        {"type": "image_url", "image_url": "https://picsum.photos/seed/picsum/200/300"},
    ]
)
llm.invoke([message])

The value of image_url can be any of the following:

  • A public image URL
  • An accessible gcs file (e.g., "gcs://path/to/file.png")
  • A base64 encoded image (e.g., data:image/png;base64,abcd124)
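
For the base64 form above, a local file can be inlined as a data URI. A minimal sketch (the file path is illustrative):

import base64

from langchain_core.messages import HumanMessage
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-pro-vision")

# Read a local image and encode it as a base64 data URI.
with open("example.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

message = HumanMessage(
    content=[
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": f"data:image/png;base64,{image_b64}"},
    ]
)
llm.invoke([message])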

Embeddings

This package also adds support for Google's embedding models.

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
embeddings.embed_query("hello, world!")
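
Batches of texts can be embedded with embed_documents, which returns one vector per input text. A minimal sketch:

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
vectors = embeddings.embed_documents(
    [
        "LangChain integrates with many model providers.",
        "Gemini is Google's family of generative models.",
    ]
)
print(len(vectors), len(vectors[0]))  # number of inputs, embedding dimensionality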

Semantic Retrieval

The GoogleVectorStore class wraps Google's Semantic Retrieval API and enables retrieval-augmented generation (RAG) in your application.

from langchain_community.document_loaders import DirectoryLoader
from langchain_google_genai import GoogleVectorStore
from langchain_text_splitters import CharacterTextSplitter

# Create a new store for housing your documents.
corpus_store = GoogleVectorStore.create_corpus(display_name="My Corpus")

# Create a new document under the above corpus.
document_store = GoogleVectorStore.create_document(
    corpus_id=corpus_store.corpus_id, display_name="My Document"
)

# Split and upload some texts to the document.
text_splitter = CharacterTextSplitter(chunk_size=500, chunk_overlap=0)
for file in DirectoryLoader(path="data/").load():
    documents = text_splitter.split_documents([file])
    document_store.add_documents(documents)

# Talk to your entire corpus, which may span many documents.
aqa = corpus_store.as_aqa()
response = aqa.invoke("What is the meaning of life?")

# Read the answer along with the attributed passages and answerability.
print(response.answer)
print(response.attributed_passages)
print(response.answerable_probability)
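
Because GoogleVectorStore is a LangChain vector store, the corpus can also be queried as a plain retriever, independent of the AQA answerer. A minimal sketch, reusing the corpus_store created above (the value of k is illustrative):

# Retrieve relevant passages directly, without generating an answer.
retriever = corpus_store.as_retriever(search_kwargs={"k": 5})
for doc in retriever.invoke("What is the meaning of life?"):
    print(doc.page_content)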

