
langchain-ibm

This package provides the integration between LangChain and IBM watsonx.ai through the ibm-watsonx-ai SDK.

Installation

To use the langchain-ibm package, follow these installation steps:

pip install -U langchain-ibm

Setting up

To use IBM's models, you must have an IBM Cloud user API key. Here's how to obtain and set up your API key:

  1. Obtain an API Key: For more details on how to create and manage an API key, refer to IBM's documentation.
  2. Set the API Key as an Environment Variable: For security reasons, it's recommended not to hard-code your API key directly in your scripts. Instead, set it as an environment variable. You can use the following code to prompt for the API key and set it as an environment variable:
import os
from getpass import getpass

watsonx_api_key = getpass()
os.environ["WATSONX_API_KEY"] = watsonx_api_key

Alternatively, you can set the environment variable in your terminal:

  • Linux/macOS: Open your terminal and execute the following command:

    export WATSONX_API_KEY='your_ibm_api_key'
    

    To make this environment variable persistent across terminal sessions, add the above line to your ~/.bashrc, ~/.bash_profile, or ~/.zshrc file.

  • Windows: For Command Prompt, use:

    set WATSONX_API_KEY=your_ibm_api_key
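
    To make this environment variable persistent, you can use setx instead (the value takes effect in new Command Prompt windows, not the current one):

    setx WATSONX_API_KEY "your_ibm_api_key"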
    

Setting parameters

You might need to adjust model parameters for different models or tasks. For more details on the parameters, refer to the Parameter Scheme in IBM's documentation.

Note: You must use the parameter schema that matches the class you are initializing. This example uses ChatWatsonx, so we import TextChatParameters:

from ibm_watsonx_ai.foundation_models.schema import TextChatParameters

parameters = TextChatParameters(
    temperature=0.5,
    max_completion_tokens=1024,
    top_p=1,
)

You can also pass the parameters as a dictionary:

parameters = {
    "temperature": 0.5,
    "max_completion_tokens": 1024,
    "top_p": 1,
}

Chat Models

ChatWatsonx class exposes chat models from IBM.

Initialize the ChatWatsonx class with the previously set parameters:

from langchain_ibm import ChatWatsonx

model = ChatWatsonx(
    model_id="PASTE THE CHOSEN MODEL_ID HERE",
    url="PASTE YOUR URL HERE",
    project_id="PASTE YOUR PROJECT_ID HERE",
    params=parameters,
)

model.invoke("Sing a ballad of LangChain.")

Note:

  • You must provide a project_id or space_id. For more information, refer to IBM's documentation.
  • Depending on the region of your provisioned service instance, use one of the URLs described here.
  • You need to specify the model you want to use for inferencing through model_id. You can find the list of available models here.

Alternatively, for all classes, you can use IBM Cloud Pak for Data credentials. For more details, refer to IBM's documentation.

from langchain_ibm import ChatWatsonx

model = ChatWatsonx(
    model_id="ibm/granite-3-3-8b-instruct",
    url="PASTE YOUR URL HERE",
    username="PASTE YOUR USERNAME HERE",
    password="PASTE YOUR PASSWORD HERE",
    project_id="PASTE YOUR PROJECT_ID HERE",
    params=parameters,
)
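
ChatWatsonx follows the standard LangChain chat-model interface, so you can also pass message objects and stream responses. A minimal sketch, reusing the model initialized above (the messages are illustrative only):

from langchain_core.messages import HumanMessage, SystemMessage

messages = [
    SystemMessage(content="You are a helpful assistant that writes short poems."),
    HumanMessage(content="Sing a ballad of LangChain."),
]

# Single call: returns an AIMessage whose text is available via .content.
response = model.invoke(messages)
print(response.content)

# Streaming: chunks arrive as they are generated.
for chunk in model.stream(messages):
    print(chunk.content, end="")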

Embedding Models

WatsonxEmbeddings class exposes embeddings from IBM.

from langchain_ibm import WatsonxEmbeddings
from ibm_watsonx_ai.metanames import EmbedTextParamsMetaNames

embed_params = {
    EmbedTextParamsMetaNames.TRUNCATE_INPUT_TOKENS: 3,
    EmbedTextParamsMetaNames.RETURN_OPTIONS: {"input_text": True},
}

embeddings = WatsonxEmbeddings(
    model_id="ibm/granite-embedding-107m-multilingual",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="PASTE YOUR PROJECT_ID HERE",
    params=embed_params,
)

embeddings.embed_query("What is the meaning of life?")
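
The standard LangChain embeddings interface also provides embed_documents for embedding several texts at once. A short sketch reusing the embeddings object above (the texts are illustrative only):

# Embed a batch of documents; one vector is returned per input text.
texts = [
    "LangChain is a framework for building LLM applications.",
    "IBM watsonx.ai provides access to foundation models.",
]
vectors = embeddings.embed_documents(texts)
print(len(vectors), len(vectors[0]))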

LLMs

WatsonxLLM class exposes LLMs from IBM.

from langchain_ibm import WatsonxLLM
from ibm_watsonx_ai.foundation_models.schema import TextGenParameters, TextGenDecodingMethod

parameters = TextGenParameters(
    decoding_method=TextGenDecodingMethod.SAMPLE,
    temperature=0.5,
    top_k=50,
    top_p=1
)

llm = WatsonxLLM(
    model_id="ibm/granite-3-3-8b-instruct",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="PASTE YOUR PROJECT_ID HERE",
    params=parameters,
)

llm.invoke("The meaning of life is")
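
WatsonxLLM also implements the standard LangChain runnable methods; for example, the same prompt can be streamed as it is generated:

# Stream the generated text token by token.
for chunk in llm.stream("The meaning of life is"):
    print(chunk, end="")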

Reranker

WatsonxRerank class exposes a reranker from IBM.

from langchain_ibm import WatsonxRerank

rerank = WatsonxRerank(
    model_id="cross-encoder/ms-marco-minilm-l-12-v2",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="PASTE YOUR PROJECT_ID HERE",
)
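
WatsonxRerank acts as a LangChain document compressor, so once initialized it can reorder documents by relevance to a query. A minimal sketch using the rerank object above (the documents and query are illustrative only):

from langchain_core.documents import Document

docs = [
    Document(page_content="Rerankers reorder retrieved documents by relevance to a query."),
    Document(page_content="LangChain provides abstractions for building LLM applications."),
    Document(page_content="IBM watsonx.ai hosts foundation models and embeddings."),
]

# compress_documents is the standard document-compressor entry point;
# it returns the documents ordered by relevance to the query.
reranked = rerank.compress_documents(docs, query="What does a reranker do?")
for doc in reranked:
    print(doc.page_content)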

Toolkit

WatsonxToolkit class exposes a toolkit of utility tools from IBM.

from langchain_ibm.agent_toolkits.utility import WatsonxToolkit

watsonx_toolkit = WatsonxToolkit(
    url="https://us-south.ml.cloud.ibm.com",
)
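
Once initialized, the toolkit can list the tools it provides; a short sketch assuming the standard LangChain toolkit interface:

# get_tools() returns the available utility tools as LangChain tools.
tools = watsonx_toolkit.get_tools()
for tool in tools:
    print(tool.name, "-", tool.description)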
