# langchain-oci

An integration package connecting Oracle Cloud Infrastructure (OCI) and LangChain.

This package contains the LangChain integrations with OCI.
## Installation

```shell
pip install -U langchain-oci
```
All integrations in this package assume that you have your credentials set up to connect to OCI services.
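These credentials typically come from an OCI SDK config file at `~/.oci/config`. A minimal sketch of a profile, with placeholder values in angle brackets:

```ini
[DEFAULT]
user=ocid1.user.oc1..<unique_id>
fingerprint=<key_fingerprint>
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..<unique_id>
region=us-chicago-1
```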
## Quick Start

This repository includes two main integration categories: OCI Generative AI and OCI Data Science Model Deployment.

## OCI Generative AI Examples
### 1. Use a Chat Model

The `ChatOCIGenAI` class exposes chat models from OCI Generative AI.
```python
from langchain_oci import ChatOCIGenAI

llm = ChatOCIGenAI(
    model_id="MY_MODEL_ID",
    service_endpoint="MY_SERVICE_ENDPOINT",
    compartment_id="MY_COMPARTMENT_ID",
    # Use max_completion_tokens instead of max_tokens for OpenAI models
    model_kwargs={"max_tokens": 1024},
    auth_profile="MY_AUTH_PROFILE",
    is_stream=True,
    auth_type="SECURITY_TOKEN",
)

llm.invoke("Sing a ballad of LangChain.")
```
### 2. Use a Completion Model

The `OCIGenAI` class exposes LLMs from OCI Generative AI.
```python
from langchain_oci import OCIGenAI

llm = OCIGenAI()
llm.invoke("The meaning of life is")
```
### 3. Use an Embedding Model

The `OCIGenAIEmbeddings` class exposes embeddings from OCI Generative AI.
```python
from langchain_oci import OCIGenAIEmbeddings

embeddings = OCIGenAIEmbeddings()
embeddings.embed_query("What is the meaning of life?")
```
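`embed_query` returns a plain list of floats, so downstream similarity ranking is ordinary vector math. A minimal pure-Python sketch, independent of the OCI API (toy vectors stand in for real embeddings):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# With real embeddings, compare embed_query(...) against each vector
# returned by embed_documents(...); identical toy vectors score 1.0:
score = cosine_similarity([1.0, 0.0], [1.0, 0.0])  # 1.0
```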
### 4. Use Structured Output

`ChatOCIGenAI` supports structured output.

Note: the default method is `function_calling`. If the default method returns `None` (e.g. for Gemini models), try `json_schema` or `json_mode`.
```python
from langchain_oci import ChatOCIGenAI
from pydantic import BaseModel


class Joke(BaseModel):
    setup: str
    punchline: str


llm = ChatOCIGenAI()
structured_llm = llm.with_structured_output(Joke)
structured_llm.invoke("Tell me a joke about programming")
```
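A minimal sketch of overriding the method, assuming `with_structured_output` accepts a `method` keyword (as the note above implies) and that a `TypedDict` may be used as the schema, which LangChain chat models generally accept alongside Pydantic classes. The model call itself is left in comments because it requires live OCI credentials:

```python
from typing import TypedDict


class Joke(TypedDict):
    """Schema passed to with_structured_output."""

    setup: str
    punchline: str


# With a configured model (requires OCI credentials):
#   llm = ChatOCIGenAI(...)
#   structured_llm = llm.with_structured_output(Joke, method="json_schema")
#   structured_llm.invoke("Tell me a joke about programming")
```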
### 5. Use the OpenAI Responses API

`ChatOCIOpenAI` supports the OpenAI Responses API.
```python
from oci_openai import OciSessionAuth

from langchain_oci import ChatOCIOpenAI

client = ChatOCIOpenAI(
    auth=OciSessionAuth(profile_name="MY_PROFILE_NAME"),
    compartment_id="MY_COMPARTMENT_ID",
    region="us-chicago-1",
    model="openai.gpt-4.1",
    conversation_store_id="MY_CONVERSATION_STORE_ID",
)

messages = [
    (
        "system",
        "You are a helpful translator. Translate the user sentence to French.",
    ),
    ("human", "I love programming."),
]
response = client.invoke(messages)
```
NOTE: By default, the `store` argument is set to `True`, which requires passing `conversation_store_id`. You can set `store=False` and omit `conversation_store_id`:
```python
from oci_openai import OciSessionAuth

from langchain_oci import ChatOCIOpenAI

client = ChatOCIOpenAI(
    auth=OciSessionAuth(profile_name="MY_PROFILE_NAME"),
    compartment_id="MY_COMPARTMENT_ID",
    region="us-chicago-1",
    model="openai.gpt-4.1",
    store=False,
)

messages = [
    (
        "system",
        "You are a helpful translator. Translate the user sentence to French.",
    ),
    ("human", "I love programming."),
]
response = client.invoke(messages)
```
## OCI Data Science Model Deployment Examples
### 1. Use a Chat Model

You may instantiate the OCI Data Science model with the generic `ChatOCIModelDeployment` or a framework-specific class like `ChatOCIModelDeploymentVLLM`.
```python
from langchain_oci.chat_models import (
    ChatOCIModelDeployment,
    ChatOCIModelDeploymentVLLM,
)

# Create an instance of an OCI Model Deployment endpoint.
# Replace the endpoint URI with your own.
endpoint = "https://modeldeployment.<region>.oci.customer-oci.com/<ocid>/predict"

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

chat = ChatOCIModelDeployment(
    endpoint=endpoint,
    streaming=True,
    max_retries=1,
    model_kwargs={
        "temperature": 0.2,
        "max_tokens": 512,
    },  # other model params...
    default_headers={
        "route": "/v1/chat/completions",
        # other request headers ...
    },
)
chat.invoke(messages)

chat_vllm = ChatOCIModelDeploymentVLLM(endpoint=endpoint)
chat_vllm.invoke(messages)
```
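The endpoint URI above follows a fixed pattern, so it can be assembled from a region and a deployment OCID. A small helper for illustration; the function name and the placeholder values are hypothetical, not part of langchain-oci:

```python
def model_deployment_endpoint(region: str, deployment_ocid: str) -> str:
    """Build an OCI Model Deployment predict URI from a region and OCID."""
    return (
        f"https://modeldeployment.{region}.oci.customer-oci.com/"
        f"{deployment_ocid}/predict"
    )


# Placeholder values for illustration only:
endpoint = model_deployment_endpoint(
    "us-ashburn-1", "ocid1.datasciencemodeldeployment.oc1..example"
)
```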
### 2. Use a Completion Model

You may instantiate the OCI Data Science model with `OCIModelDeploymentLLM` or `OCIModelDeploymentVLLM`.
```python
from langchain_oci.llms import OCIModelDeploymentLLM, OCIModelDeploymentVLLM

# Create an instance of an OCI Model Deployment endpoint.
# Replace the endpoint URI and model name with your own.
endpoint = "https://modeldeployment.<region>.oci.customer-oci.com/<ocid>/predict"

llm = OCIModelDeploymentLLM(
    endpoint=endpoint,
    model="odsc-llm",
)
llm.invoke("Who was the first president of the United States?")

vllm = OCIModelDeploymentVLLM(
    endpoint=endpoint,
)
vllm.invoke("Who was the first president of the United States?")
```
### 3. Use an Embedding Model

You may instantiate the OCI Data Science model with `OCIModelDeploymentEndpointEmbeddings`.
```python
from langchain_oci.embeddings import OCIModelDeploymentEndpointEmbeddings

# Create an instance of an OCI Model Deployment endpoint.
# Replace the endpoint URI with your own.
endpoint = "https://modeldeployment.<region>.oci.customer-oci.com/<ocid>/predict"

embeddings = OCIModelDeploymentEndpointEmbeddings(
    endpoint=endpoint,
)

query = "Hello World!"
embeddings.embed_query(query)

documents = ["This is a sample document", "and here is another one"]
embeddings.embed_documents(documents)
```
## File details

### Source distribution: langchain_oci-0.2.0.tar.gz

- Size: 38.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | 7b220b41fcfb86a26c815919033e4c13cd1a641ebd1c6174993bce2bb96ba036 |
| MD5 | 2debbd8d72a8ded1d65a36dbb2c90342 |
| BLAKE2b-256 | 9c4d09e985d91df41da240e50b1cd9f4443683a68aada3a466c1d601e8b40621 |
### Built distribution: langchain_oci-0.2.0-py3-none-any.whl

- Size: 44.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | 3abe22811201a2eaef04385ae11023e47ad4a26c14fc9dc2f25aeeea6692cad8 |
| MD5 | b0a52c452ba34b937cad4d904c0e8b2e |
| BLAKE2b-256 | 64ad1f7e66b89f2eed244a6aab40ead1545bb1033b9021983d783029dc571b96 |