
IBM watsonx.ai API Client

Reason this release was yanked:

This is a special release for CPD 5.4 containing the 1.5.5 changes plus Semantic Schema support and support for local models via Model Gateway.

Project description

📦 ibm-watsonx-ai

Official IBM watsonx.ai Python SDK




Enterprise-grade Python client for building, tuning and deploying AI models with IBM watsonx.ai

🚀 Quick Start · 📘 Documentation · 📓 Examples

📌 Overview

ibm-watsonx-ai is the official Python SDK for IBM watsonx.ai, an enterprise-grade AI platform for building, training, tuning, deploying, and operating AI models at scale.

The SDK provides a unified, production-ready Python interface to the full watsonx.ai ecosystem, including Foundation Models (such as LLMs), AutoAI experiments, Retrieval-Augmented Generation (RAG), model tuning, deployment, and data integration.

With ibm-watsonx-ai, developers and data scientists can seamlessly integrate advanced AI capabilities into Python applications running on IBM watsonx.ai for IBM Cloud or IBM watsonx.ai software, while meeting enterprise requirements such as security, governance, and scalability.


🎯 What This SDK Is Used For

The ibm-watsonx-ai SDK is designed to support the entire AI lifecycle:

  • 🔐 Secure authentication and environment configuration
  • 🤖 Inference with Foundation Models (LLMs, embeddings, time-series, audio)
  • 📚 Building Retrieval-Augmented Generation (RAG) systems
  • 🧪 Running and optimizing AutoAI experiments
  • ⚙️ Fine-tuning and prompt tuning of models
  • 🚀 Deploying models to scalable inference endpoints
  • 🔗 Integrating enterprise data sources into AI workflows

It is suitable for research, prototyping, and production deployments.


📦 Installation

Install from PyPI:

pip install ibm-watsonx-ai

Install with optional extras:

pip install "ibm-watsonx-ai[rag]"
| Extra | Description |
|-------|-------------|
| rag   | Retrieval-Augmented Generation utilities |
| mcp   | Model Context Protocol support |

🚀 Quick Start

Authentication

Set up your Credentials and create an APIClient instance:

from ibm_watsonx_ai import Credentials, APIClient

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",
    api_key="<your-ibm-api-key>"
)

# Initialize APIClient using a space ID (you can also use a project ID)
api_client = APIClient(credentials, space_id="<your-space-id>")

Working with ModelInference

List available chat models

list(api_client.foundation_models.ChatModels)

# Output example:
# [<ChatModels.GRANITE_3_3_8B_INSTRUCT: 'ibm/granite-3-3-8b-instruct'>,
#  <ChatModels.GRANITE_4_H_SMALL: 'ibm/granite-4-h-small'>,
#  <ChatModels.LLAMA_3_3_70B_INSTRUCT: 'meta-llama/llama-3-3-70b-instruct'>,
#  <ChatModels.GPT_OSS_120B: 'openai/gpt-oss-120b'>]
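Each entry in the listing above is an enum member whose value is the `model_id` string you pass when creating a `ModelInference`. The sketch below illustrates that relationship with a stand-in enum (the member names and values mirror the example output above; in real code, iterate `api_client.foundation_models.ChatModels` instead of defining your own):

```python
from enum import Enum

# Illustrative stand-in for the SDK's ChatModels enum; real code should use
# api_client.foundation_models.ChatModels, which is populated dynamically.
class ChatModels(str, Enum):
    GRANITE_4_H_SMALL = "ibm/granite-4-h-small"
    LLAMA_3_3_70B_INSTRUCT = "meta-llama/llama-3-3-70b-instruct"

# A member's .value is the model_id string accepted by ModelInference
model_id = ChatModels.GRANITE_4_H_SMALL.value
print(model_id)  # ibm/granite-4-h-small
```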

Initialize ModelInference

from ibm_watsonx_ai.foundation_models import ModelInference

# Create a `ModelInference` instance for the selected model
model = ModelInference(
    api_client=api_client,
    model_id="ibm/granite-4-h-small"
)
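Generation behavior can be tuned by passing a parameters dictionary to the chat call. A minimal sketch, assuming dict-style parameters (exact parameter names and supported values vary by model and SDK release; check the ModelInference documentation):

```python
# Hedged sketch: common chat parameters passed as a plain dict.
chat_params = {
    "temperature": 0.2,   # lower values make output more deterministic
    "max_tokens": 256,    # upper bound on generated tokens
}

# In practice, these would be supplied to the call, e.g.:
# response = model.chat(messages=messages, params=chat_params)
print(chat_params)
```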

Chat with the model

# Prepare messages
messages = [
    {"role": "system", "content": "You are a helpful assistant that translates English to French."},
    {"role": "user", "content": "I love you for listening to Rock."}
]

# Get model response
response = model.chat(messages=messages)
print(response)
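The chat response is a dictionary following an OpenAI-style chat-completion structure. The sketch below extracts the assistant's text from a hypothetical sample response (the sample content and token counts are illustrative, not real output; field names may vary by release):

```python
# Illustrative sample of a chat response (not real model output).
sample_response = {
    "model_id": "ibm/granite-4-h-small",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Je t'aime parce que tu écoutes du rock."},
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 31, "completion_tokens": 12, "total_tokens": 43},
}

def extract_text(response: dict) -> str:
    """Return the assistant's text from the first choice of a chat response."""
    return response["choices"][0]["message"]["content"]

print(extract_text(sample_response))
```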

Project details



Download files

Download the file for your platform.

Source Distribution

ibm_watsonx_ai-1.5.8.tar.gz (721.1 kB)

Uploaded Source

Built Distribution


ibm_watsonx_ai-1.5.8-py3-none-any.whl (1.1 MB)

Uploaded Python 3

File details

Details for the file ibm_watsonx_ai-1.5.8.tar.gz.

File metadata

  • Download URL: ibm_watsonx_ai-1.5.8.tar.gz
  • Upload date:
  • Size: 721.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for ibm_watsonx_ai-1.5.8.tar.gz
| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 0f64faf4faac6a8a81687b1b83a1cec3d209c0e9f92ee8745ae5dbb26f93037b |
| MD5 | e928346aaa910030db2f0c2bec77a9de |
| BLAKE2b-256 | 5ce887f1a08bf79567fbb336512bcc7f3288a933670de2e4d31ecb279c58b088 |


File details

Details for the file ibm_watsonx_ai-1.5.8-py3-none-any.whl.

File metadata

  • Download URL: ibm_watsonx_ai-1.5.8-py3-none-any.whl
  • Upload date:
  • Size: 1.1 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for ibm_watsonx_ai-1.5.8-py3-none-any.whl
| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | a3d6f2edfb7f024e3274722b13024b9d45a916d73330202bedca701ab7a5de48 |
| MD5 | 0471ca4a9c2b1cad4b9d120dbde27aef |
| BLAKE2b-256 | 85b50e8fe88f4e26d44b768f9f3766c13e5659566775e5dad8211e4a6be067d2 |

