
llm-watsonx

LLM plugin for IBM watsonx models.

Installation

Install this plugin in the same environment as LLM:

llm install llm-watsonx

Configuration

You will need to set the following environment variables:

export WATSONX_API_KEY=
export WATSONX_PROJECT_ID=

Optionally, if your watsonx instance is not in us-south:

export WATSONX_URL=
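A complete shell session might look like the following sketch. The credential values are placeholders, and the Frankfurt (eu-de) endpoint is shown only as an example of a non-us-south regional URL:

```shell
# Placeholder credentials -- substitute your own.
export WATSONX_API_KEY="my-api-key"
export WATSONX_PROJECT_ID="my-project-id"
# Only needed when your instance is outside us-south,
# e.g. the Frankfurt (eu-de) regional endpoint:
export WATSONX_URL="https://eu-de.ml.cloud.ibm.com"
echo "watsonx URL set to: $WATSONX_URL"
```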

Usage

Get list of commands:

llm watsonx --help

Models

See all available models:

llm watsonx list-models

See all generation options:

llm watsonx list-model-options

Example

llm -m watsonx/meta-llama/llama-3-8b-instruct \
    -o temperature .4 \
    -o max_new_tokens 250 \
    "What is IBM watsonx?"

Chat Example

llm chat -m watsonx/meta-llama/llama-3-8b-instruct \
    -o max_new_tokens 1000 \
    -s "You are an assistant for a CLI (command line interface). Provide Unix commands to help users achieve their tasks."

Embeddings

See all available embedding models:

llm watsonx list-embedding-models

Example

cat README.md | llm embed -m watsonx/ibm/slate-30m-english-rtrvr
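`llm embed` prints the embedding as a JSON array of floats, so its output can be piped into a short script for comparison. Below is a minimal, self-contained sketch of cosine similarity between two such vectors; the sample arrays are made up and stand in for real `llm embed` output (no watsonx access is needed to run it):

```python
import json
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Each string stands in for the JSON array printed by `llm embed`.
v1 = json.loads("[0.1, 0.2, 0.3]")
v2 = json.loads("[0.1, 0.2, 0.25]")
print(round(cosine_similarity(v1, v2), 4))
```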
