# gigachain-google-vertexai

An integration package connecting Google Vertex AI and LangChain.
This package contains the LangChain integrations for Google Cloud generative models.
## Installation

```shell
pip install -U gigachain-google-vertexai
```
## Chat Models

The `ChatVertexAI` class exposes chat models such as `gemini-pro` and `chat-bison`.
To use it, you should have a Google Cloud project with the required APIs enabled and credentials configured. Initialize the model as follows:

```python
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro")
llm.invoke("Sing a ballad of LangChain.")
```
You can use other models, e.g. `chat-bison`:

```python
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="chat-bison", temperature=0.3)
llm.invoke("Sing a ballad of LangChain.")
```
### Multimodal inputs

The Gemini vision model supports image inputs in a single chat message. For example:

```python
from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro-vision")

message = HumanMessage(
    content=[
        {
            "type": "text",
            "text": "What's in this image?",
        },  # You can optionally provide text parts
        {"type": "image_url", "image_url": {"url": "https://picsum.photos/seed/picsum/200/300"}},
    ]
)
llm.invoke([message])
```
The value of `image_url` can be any of the following:

- A public image URL
- An accessible Google Cloud Storage file (e.g., `"gs://path/to/file.png"`)
- A local file path
- A base64-encoded image (e.g., `data:image/png;base64,abcd124`)
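For the base64 case, a local file must first be read and encoded into a data URL. A minimal stdlib sketch of building such a content part (the helper name `image_part_from_file` is my own, not part of the package):

```python
import base64
import mimetypes


def image_part_from_file(path: str) -> dict:
    """Build an image_url content part from a local file as a base64 data URL."""
    mime = mimetypes.guess_type(path)[0] or "image/png"
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")
    return {"type": "image_url", "image_url": {"url": f"data:{mime};base64,{encoded}"}}
```

The resulting dict can be placed directly into the `content` list of a `HumanMessage`, alongside any text parts.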
## Embeddings

You can use Google Cloud's embedding models as follows:

```python
from langchain_google_vertexai import VertexAIEmbeddings

embeddings = VertexAIEmbeddings()
embeddings.embed_query("hello, world!")
```
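Embedding vectors are typically compared with cosine similarity, e.g. to rank documents against a query. A small stdlib helper for scoring two vectors such as those returned by `embed_query` (this helper is illustrative, not part of the package):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm
```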
## LLMs

You can use Google Cloud's generative AI models as LangChain LLMs:

```python
from langchain_core.prompts import PromptTemplate
from langchain_google_vertexai import ChatVertexAI

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)
llm = ChatVertexAI(model_name="gemini-pro")
chain = prompt | llm

question = "Who was the president of the USA in 1994?"
print(chain.invoke({"question": question}))
```
You can use Gemini and PaLM models, including code-generation ones:

```python
from langchain_google_vertexai import VertexAI

llm = VertexAI(model_name="code-bison", max_output_tokens=1000, temperature=0.3)
question = "Write a python function that checks if a string is a valid email address"
output = llm.invoke(question)
```
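For reference, a regex-based checker along the lines of what such a prompt might produce (this is a hand-written illustrative sketch, not actual model output):

```python
import re

# Simplified pattern: local part, "@", domain labels, and a 2+ letter TLD.
# Real-world email validation per RFC 5322 is considerably more permissive.
EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")


def is_valid_email(s: str) -> bool:
    """Return True if the string looks like a valid email address."""
    return bool(EMAIL_RE.fullmatch(s))
```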