
📦 ibm-watsonx-ai

Official IBM watsonx.ai Python SDK




Enterprise-grade Python client for building, tuning and deploying AI models with IBM watsonx.ai

🚀 Quick Start · 📘 Documentation · 📓 Examples

📌 Overview

ibm-watsonx-ai is the official Python SDK for IBM watsonx.ai, an enterprise-grade AI platform for building, training, tuning, deploying, and operating AI models at scale.

The SDK provides a unified, production-ready Python interface to the full watsonx.ai ecosystem, including Foundation Models (LLMs, embeddings, and more), AutoAI experiments, Retrieval-Augmented Generation (RAG), model tuning, deployment, and data integration.

With ibm-watsonx-ai, developers and data scientists can seamlessly integrate advanced AI capabilities into Python applications running on IBM watsonx.ai for IBM Cloud or IBM watsonx.ai software, while meeting enterprise requirements such as security, governance, and scalability.


🎯 What This SDK Is Used For

The ibm-watsonx-ai SDK is designed to support the entire AI lifecycle:

  • 🔐 Secure authentication and environment configuration
  • 🤖 Inference with Foundation Models (LLMs, embeddings, time-series, audio)
  • 📚 Building Retrieval-Augmented Generation (RAG) systems
  • 🧪 Running and optimizing AutoAI experiments
  • ⚙️ Fine-tuning and prompt tuning of models
  • 🚀 Deploying models to scalable inference endpoints
  • 🔗 Integrating enterprise data sources into AI workflows

It is suitable for research, prototyping, and production deployments.


📦 Installation

Install from PyPI:

pip install ibm-watsonx-ai

Install with optional extras:

pip install "ibm-watsonx-ai[rag]"
Extra  Description
-----  -----------
rag    Retrieval-Augmented Generation utilities
mcp    Model Context Protocol integration

🚀 Quick Start

Authentication

from ibm_watsonx_ai import Credentials, APIClient

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",
    api_key="<your-ibm-api-key>"
)

api_client = APIClient(credentials, space_id="<your-space-id>")
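
In practice you will usually want to load the API key from the environment rather than hardcoding it in source. A minimal sketch; the variable names WATSONX_API_KEY and WATSONX_URL are illustrative conventions, not names required by the SDK:

```python
import os

# Read connection settings from the environment. The variable names used
# here are illustrative; pick whatever convention your deployment uses.
api_key = os.environ.get("WATSONX_API_KEY")  # None if not set
url = os.environ.get("WATSONX_URL", "https://us-south.ml.cloud.ibm.com")

# These values are then passed to Credentials exactly as above:
# credentials = Credentials(url=url, api_key=api_key)
```

This keeps secrets out of version control and lets the same code run against different regions or environments without modification.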

Working with ModelInference

Get available models

list(api_client.foundation_models.ChatModels)

# [<ChatModels.GRANITE_3_3_8B_INSTRUCT: 'ibm/granite-3-3-8b-instruct'>,
#  <ChatModels.GRANITE_4_H_SMALL: 'ibm/granite-4-h-small'>,
#  <ChatModels.LLAMA_3_3_70B_INSTRUCT: 'meta-llama/llama-3-3-70b-instruct'>,
#  <ChatModels.GPT_OSS_120B: 'openai/gpt-oss-120b'>]

Initialization

from ibm_watsonx_ai.foundation_models import ModelInference

model = ModelInference(
    api_client=api_client,
    model_id="ibm/granite-4-h-small"
)

Chat with model

messages = [
    {"role": "system", "content": "You are a helpful assistant that translates English to French."},
    {"role": "user", "content": "I love you for listening to Rock."}
]

response = model.chat(messages=messages)
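
The value returned by chat is a dictionary. In current SDK versions it follows an OpenAI-style layout with a choices list, but the exact shape may vary by release, so treat the sample below as illustrative rather than a guaranteed schema:

```python
# Illustrative response shape (assumed OpenAI-style layout); the real
# dictionary returned by model.chat() may contain additional fields.
sample_response = {
    "choices": [
        {"message": {"role": "assistant",
                     "content": "Je t'aime parce que tu écoutes du rock."}}
    ]
}

# Extract the assistant's reply text.
reply = sample_response["choices"][0]["message"]["content"]

# To continue the conversation, append the assistant turn and a new user
# turn, then call model.chat(messages=messages) again.
messages = [
    {"role": "system", "content": "You are a helpful assistant that translates English to French."},
    {"role": "user", "content": "I love you for listening to Rock."},
]
messages.append({"role": "assistant", "content": reply})
messages.append({"role": "user", "content": "Now translate it to Spanish."})
```

Accumulating turns this way is the standard pattern for multi-turn chat: the model is stateless between calls, so the full message history is sent on every request.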
