An integration package connecting Google VertexAI and LangChain

langchain-google-vertexai

This package contains the LangChain integrations for Google Cloud generative models.

Installation

pip install -U langchain-google-vertexai

Chat Models

The ChatVertexAI class exposes models such as gemini-pro and chat-bison.

To use it, you should have a Google Cloud project with the Vertex AI API enabled and credentials configured. Initialize the model as:

from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro")
llm.invoke("Sing a ballad of LangChain.")

You can use other models, e.g. chat-bison:

from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="chat-bison", temperature=0.3)
llm.invoke("Sing a ballad of LangChain.")

Multimodal inputs

The Gemini vision model supports image inputs in a single chat message. Example:

from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro-vision")
# example
message = HumanMessage(
    content=[
        {
            "type": "text",
            "text": "What's in this image?",
        },  # You can optionally provide text parts
        {"type": "image_url", "image_url": {"url": "https://picsum.photos/seed/picsum/200/300"}},
    ]
)
llm.invoke([message])

The value of image_url can be any of the following:

  • A public image URL
  • A Google Cloud Storage URI (e.g., "gs://path/to/file.png")
  • A base64-encoded image (e.g., data:image/png;base64,abcd124)
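For the base64 option, the data URL can be built with the standard library alone; a minimal sketch (the helper name and the placeholder bytes are illustrative, not part of this package):

```python
import base64


def to_data_url(image_bytes: bytes, mime_type: str = "image/png") -> str:
    """Encode raw image bytes as a data URL for the image_url field."""
    encoded = base64.b64encode(image_bytes).decode("utf-8")
    return f"data:{mime_type};base64,{encoded}"


# In practice the bytes would come from reading an image file,
# e.g. Path("photo.png").read_bytes().
url = to_data_url(b"\x89PNG\r\n")
print(url.startswith("data:image/png;base64,"))  # → True
```

The resulting string can be passed directly as the "url" value of an image_url content part.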

Embeddings

You can use Google Cloud's embedding models as follows:

from langchain_google_vertexai import VertexAIEmbeddings

embeddings = VertexAIEmbeddings()
embeddings.embed_query("hello, world!")
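embed_query returns a plain Python list of floats, so the vectors can be compared directly, for example with cosine similarity. A minimal, standard-library-only sketch (the helper function is ours, not part of this package):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Identical directions score 1.0; orthogonal directions score 0.0.
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # → 1.0
```

In a retrieval setting you would compute this between the embedding of a query and the embeddings of your documents, then rank documents by score.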

LLMs

You can use Google Cloud's generative AI models as LangChain LLMs:

from langchain_core.prompts import PromptTemplate
from langchain_google_vertexai import ChatVertexAI

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)

llm = ChatVertexAI(model_name="gemini-pro")
chain = prompt | llm

question = "Who was the president of the USA in 1994?"
print(chain.invoke({"question": question}))

You can use Gemini and PaLM models, including code-generation ones:

from langchain_google_vertexai import VertexAI

llm = VertexAI(model_name="code-bison", max_output_tokens=1000, temperature=0.3)

question = "Write a python function that checks if a string is a valid email address"

output = llm.invoke(question)
