ibm-watsonx-ai-cli
IBM watsonx.ai CLI for Agents
A CLI tool that enables developers to deploy AI services with agents to watsonx.ai directly from their local environment.
Repository containing the source code for the ibm-watsonx-ai-cli package.
Contact info
Maintainers
- Łukasz Ćmielowski - lukasz.cmielowski@pl.ibm.com
- Daniel Ryszka – daniel.ryszka@pl.ibm.com
- Mateusz Szewczyk – mateusz.szewczyk@ibm.com
- Mateusz Świtała – mateusz.switala@ibm.com
Installation
pip install -U ibm-watsonx-ai-cli
[!WARNING] The ibm-watsonx-ai-cli package requires poetry to be installed. To install Poetry, follow the instructions on the Poetry installation page.
Usage
Run CLI
$ watsonx-ai [OPTIONS] COMMAND [ARGS]...
[OPTIONS]
--version -v Print the current CLI version.
--help Show this message and exit.
COMMAND
- template Explore, download, and try out templates.
- service Work with deployed templates.
- app Build & run a UI playground.
TEMPLATE
- list List all available templates.
- new Creates a selected template in a local env. [ARGS: name, target]
- invoke Executes the template code locally with demo data. [OPTIONAL ARGS: query]
SERVICE
- list List all deployed AI services.
- new Create & deploy a new AI service from a root template directory. [OPTIONAL ARGS: name]
- get Get service details. [OPTIONAL ARGS: deployment_id]
- delete Deletes an AI service. [ARGS: deployment_id]
- invoke Calls the service by providing a test record. [OPTIONAL ARGS: deployment_id, query]
APP
- list List playground app samples.
- new Creates a demo playground app for the service. [OPTIONAL ARGS: name, target_dir]
- run Start the playground app. [OPTIONAL ARGS: target_dir]
Template
watsonx-ai template list
List all available templates.
Usage:
$ watsonx-ai template list [OPTIONS]
Options:
--help: Show this message and exit.
watsonx-ai template new
Creates a selected template in a local env.
Usage:
$ watsonx-ai template new [OPTIONS] name target
Arguments:
name: Name of the template to download, or its index in the list of available templates.
target: Target directory where the template will be saved.
Options:
--help: Show this message and exit.
watsonx-ai template invoke
Executes the template code locally with demo data.
Usage:
$ watsonx-ai template invoke [OPTIONS] query
Arguments:
query: [Optional] Input query for the template code. If not specified, the CLI uses the file pointed to by cli.options.payload_path in config.toml. The content of that file is sent directly to the AI service and should conform to the AI service request schema.
Options:
--help: Show this message and exit.
Service
watsonx-ai service list
List all deployed AI services.
Usage:
$ watsonx-ai service list [OPTIONS]
Options:
--help: Show this message and exit.
watsonx-ai service new
Create & deploy a new AI service from a root template directory. After the deployment is created successfully, the deployment_id is saved automatically in the [deployment] section of config.toml.
Running this command builds a package distribution of the implemented agent in the dist folder. Warning: if dist already exists, distribution creation is skipped.
Usage:
$ watsonx-ai service new [OPTIONS] name
Arguments:
name: [Optional] Name for the deployed AI service. If not provided, the deployment name defaults to the name of the current directory.
Options:
--help: Show this message and exit.
watsonx-ai service get
Get deployed service details.
Usage:
$ watsonx-ai service get [OPTIONS] deployment_id
Arguments:
deployment_id: ID of the deployed AI service.
Options:
--help: Show this message and exit.
watsonx-ai service delete
Deletes an AI service.
Usage:
$ watsonx-ai service delete [OPTIONS] deployment_id
Arguments:
deployment_id: ID of the deployed AI service.
Options:
--help: Show this message and exit.
watsonx-ai service invoke
Calls the service by providing a test record.
Usage:
$ watsonx-ai service invoke [OPTIONS] query
Arguments:
query: [Optional] Test data to send to the deployed AI service. If not specified, the CLI uses the file pointed to by cli.options.payload_path in config.toml. The content of that file is sent directly to the deployed AI service and should conform to the AI service request schema.
Options:
--deployment_id: [Optional] ID of a deployed AI service. If not provided, taken from the deployment.deployment_id field in config.toml.
--help: Show this message and exit.
App
watsonx-ai app list
List playground app samples.
Usage:
$ watsonx-ai app list [OPTIONS]
Options:
--help: Show this message and exit.
watsonx-ai app new
Creates a demo playground app for the service.
Usage:
$ watsonx-ai app new [OPTIONS] name target_dir
Arguments:
name: [Optional] The name of the app to use. If not provided, the user will be prompted to choose one.
target_dir: [Optional] The target folder where the app will be downloaded. If not provided, the user will be prompted to enter one.
Options:
--help: Show this message and exit.
watsonx-ai app run
Start the playground app.
Usage:
$ watsonx-ai app run [OPTIONS] target_dir
Arguments:
target_dir: [Optional] The directory containing the app.
Options:
--dev | -d: [Optional] A flag indicating whether the app should run in developer mode.
--help: Show this message and exit.
Secrets
Environment Variables for IBM watsonx.ai for IBM Cloud
For security reasons, it is recommended not to hard-code your API key or other secrets directly in your scripts or in config.toml. Instead, set them as environment variables. The variables below are used to authenticate to watsonx.ai APIs when the analogous options are not provided in the configuration file.
import os
WATSONX_URL = os.environ.get("WATSONX_URL", "")
WATSONX_APIKEY = os.environ.get("WATSONX_APIKEY", "")
WATSONX_TOKEN = os.environ.get("WATSONX_TOKEN", "")
WATSONX_SPACE_ID = os.environ.get("WATSONX_SPACE_ID", "")
WATSONX_PROJECT_ID = os.environ.get("WATSONX_PROJECT_ID", "")
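For example, in a POSIX shell the variables can be set for the current session as follows. The values are placeholders: substitute your own region URL, API key, and deployment space ID.

```shell
# Placeholder values; the URL must match your watsonx.ai region.
export WATSONX_URL="https://us-south.ml.cloud.ibm.com"
export WATSONX_APIKEY="YOUR_IBM_CLOUD_API_KEY"
export WATSONX_SPACE_ID="YOUR_DEPLOYMENT_SPACE_ID"
```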
Config file (config.toml)
config.toml is an optional file that can be defined in the template's directory. Note that some templates require a correctly prepared config.toml, since a template's content may depend on some of the available configuration options. If a downloaded template ships a config.toml with placeholders, read its description carefully and fill in the required fields.
Available configuration options
Below are all the sections and options you can have in your config.toml.
[cli.options]
[cli.options]
# If true, the CLI `invoke` command tries to use the `ai_service.generate_stream` function for local tests; otherwise it uses `ai_service.generate`.
# Default: true
stream = true
# Path to a JSON file with a complete payload that will be sent to the appropriate AI service generate function.
# Note that the payload file is only used when no `query` is provided to the `invoke` command.
# Default: None
payload_path = ""
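For example, if payload_path points to a payload.json file, that file might contain a complete request body such as the one below. This is only an illustrative shape; the actual structure must match the AI service request schema of the template you are using.

```json
{
  "messages": [
    { "role": "user", "content": "What is watsonx.ai?" }
  ]
}
```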
[deployment]
If present, its fields will be used for authentication to watsonx.ai APIs.
[deployment]
# One of the below is required.
# To determine your `api_key`, refer to `IBM Cloud console API keys <https://cloud.ibm.com/iam/apikeys>`_.
watsonx_apikey = "PLACEHOLDER FOR YOUR APIKEY"
watsonx_token = "PLACEHOLDER FOR YOUR TOKEN"
# should follow the format: `https://{REGION}.ml.cloud.ibm.com`
watsonx_url = ""
# Deployment space id is required to create deployment with AI service content.
space_id = "PLACEHOLDER FOR YOUR SPACE ID"
# Populated automatically with the last created deployment_id every time `watsonx-ai service new` finishes successfully
deployment_id = ""
[deployment.online.parameters]
This section is recommended for users of IBM watsonx.ai for IBM Cloud when additional parameters need to be passed to the AI service function during deployment creation. When creating a deployment, additional parameters can be passed inside the online.parameters object and later referenced in the content of the AI service function. The exact set of parameters may depend on the template used.
[deployment.online.parameters]
# Example set of parameters that must be passed during deployment creation of a particular template to guarantee its correct functioning
url = "" # should follow the format: `https://{REGION}.ml.cloud.ibm.com`
space_id = "PLACEHOLDER FOR YOUR SPACE ID"
model_id = "mistralai/mistral-large" # underlying model for inference
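A hypothetical sketch of how such parameters might be consumed inside an AI service function: the function signature and parameter names below are assumptions that mirror the example config above, not the actual code of any template.

```python
# Hypothetical sketch: the online.parameters from the example config above are
# made available to the AI service function at deployment time.
# Names mirror the example config; this is not any template's actual code.
def deployable_ai_service(context=None, url="", space_id="",
                          model_id="mistralai/mistral-large", **kwargs):
    def generate(payload):
        # Use url / space_id / model_id here to configure model inference.
        return {"model_id": model_id, "echo": payload}
    return generate
```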
[deployment.software_specification]
During deployment creation, a custom software specification with the template package distribution is stored and used when running AI service inference. More details about software specifications can be found in the IBM watsonx.ai Runtime "Supported frameworks and software specifications" documentation and the "Customizing deployment runtimes" documentation.
[deployment.software_specification]
# Name for the derived software specification. If not provided, a default one is used, built from the package name: "{pkg_name}-sw-spec"
name = ""
# Whether to overwrite (delete the existing and create a new one with the same name) the derived watsonx software specification
# Default: false
overwrite = false
# The base software specification used to deploy the AI service. The template dependencies will be installed based on the packages included in the selected base software specification
# Default: "runtime-24.1-py3.11"
base_sw_spec = "runtime-24.1-py3.11"
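As a small illustration of the default naming rule above, the derived specification name is built from the package name. The helper name below is hypothetical; only the "{pkg_name}-sw-spec" pattern comes from the description.

```python
# Illustration of the default derived software-specification name described
# above: "{pkg_name}-sw-spec". The helper name is hypothetical.
def default_sw_spec_name(pkg_name: str) -> str:
    return f"{pkg_name}-sw-spec"


print(default_sw_spec_name("langgraph-react-agent"))
```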
Sample flow
1. List available templates:
   watsonx-ai template list
2. Create the template in my IDE:
   watsonx-ai template new "base/langgraph-react-agent" "langgraph-react-agent"
3. Go into the template directory:
   cd langgraph-react-agent
   cp config.toml.example config.toml
4. Customize the template code and config.toml to my needs.
5. Test the template code by calling invoke:
   watsonx-ai template invoke "question"
6. Create a service:
   watsonx-ai service new
7. Test the service:
   watsonx-ai service invoke "question"
8. List available apps:
   watsonx-ai app list
9. Create an app in my IDE:
   watsonx-ai app new "base/nextjs-chat-with-ai-service" "chat-with-ai-service_app"
10. Go into the app directory and copy the environment variable file:
    cd chat-with-ai-service_app
    cp template.env .env
11. Update the .env file with your credentials.
12. Start the playground app:
    watsonx-ai app run