llm-watsonx
LLM plugin for IBM watsonx models
Installation
Install this plugin in the same environment as LLM:
llm install llm-watsonx
Configuration
You will need to provide the following:
- API Key from IBM Cloud IAM: https://cloud.ibm.com/iam/apikeys
- Project ID (from watsonx.ai instance URL: https://dataplatform.cloud.ibm.com/projects//)
export WATSONX_API_KEY=
export WATSONX_PROJECT_ID=
- Optionally, if your watsonx instance is not in us-south:
export WATSONX_URL=
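Taken together, a typical shell setup looks like the following sketch. The values are placeholders, and the eu-de endpoint URL is given only as an example of a non-default region:

```shell
# Placeholders -- substitute your own values from IBM Cloud
export WATSONX_API_KEY="<your IBM Cloud IAM API key>"
export WATSONX_PROJECT_ID="<your watsonx.ai project ID>"
# Only needed when your instance is outside us-south (example region URL):
export WATSONX_URL="https://eu-de.ml.cloud.ibm.com"
```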
Usage
Get a list of commands:
llm watsonx --help
Models
See all available models:
llm watsonx list-models
See all generation options:
llm watsonx list-model-options
Example
llm -m watsonx/meta-llama/llama-3-8b-instruct \
-o temperature 0.4 \
-o max_new_tokens 250 \
"What is IBM watsonx?"
Chat Example
llm chat -m watsonx/meta-llama/llama-3-8b-instruct \
-o max_new_tokens 1000 \
-s "You are a chatbot assistant for a CLI (command line interface). Help users with unix commands to achieve their tasks."
Embeddings
See all available models:
llm watsonx list-embedding-models
Example
cat README.md | llm embed -m watsonx/ibm/slate-30m-english-rtrvr
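The embed command prints the embedding as a JSON array. A common way to compare two such vectors is cosine similarity; here is a minimal, self-contained sketch (the sample vectors are made up for illustration, not real slate-30m-english-rtrvr output):

```python
import json
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In practice each vector would come from `llm embed` output saved to a file;
# these tiny made-up vectors just demonstrate the computation.
vec_a = json.loads("[0.1, 0.2, 0.3]")
vec_b = json.loads("[0.1, 0.2, 0.25]")
print(cosine_similarity(vec_a, vec_b))
```

Similarity close to 1.0 means the two texts embed in nearly the same direction.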
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
llm_watsonx-0.1.0.tar.gz (7.8 kB)
Built Distribution
llm_watsonx-0.1.0-py3-none-any.whl
Hashes for llm_watsonx-0.1.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 7dadcf1eab43ea719e0a88032df63ea901f3d7c75021ec884af5f0bcc2ad919f
MD5 | d99f5df2c46133447f07610887fa12a1
BLAKE2b-256 | f2fa8eea39226b5698b0c26e7e6131b01eba12e5f8f0a9d0b0c2574ec421cb2e