
LLM plugin for models hosted by xAI

Project description

llm-xai

LLM plugin to access xAI's models

Installation

First, install the LLM command-line utility.
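
If you don't already have it, installing LLM via pip is one straightforward option (pipx and Homebrew installs also work):

pip install llm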

Now install this plugin in the same environment as LLM.

llm install llm-xai

Configuration

You will need an API key from xAI, which you can create in the xAI console.

You can set that as an environment variable called XAI_KEY, or add it to LLM's set of saved keys using:

llm keys set xai
Enter key: <paste key here>
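
Alternatively, to use the environment variable for the current shell session (the value shown is just a placeholder):

export XAI_KEY="<paste key here>"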

Usage

To list available models, run:

llm models list

You should see a list that looks something like this:

xAI: xAI/grok-beta
xAI: xAIcompletion/grok-beta
...
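
The listing covers models from every installed plugin; to show just the xAI entries, an ordinary shell filter works:

llm models list | grep xAI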

To chat with a model, pass its full model ID to the -m option of llm chat, like this:

llm chat -m xAI/grok-beta

Enter your prompt, and have a chat:

Chatting with xAI/grok-beta
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> sup playa
Hey, what's up?
>
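
You can also send a single prompt without entering chat mode (the prompt text below is just an example):

llm -m xAI/grok-beta "Suggest ten names for a pet pelican"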

xAI also offers a completion endpoint, exposed through the xAIcompletion/ model IDs:

llm -m xAIcompletion/grok-beta "You must know this about me:"
 I’m not a fan of being alone. I have a hard time finding peace in the silence. My own thoughts drive me crazy. But I knew I had to do this for myself. I had to prove to myself that I could be alone and be okay with it
...

You can set a shorter alias for a model using the llm aliases command like so:

llm aliases set grok xAI/grok-beta

Now you can prompt Grok using:

cat llm_xai.py | llm -m grok -s 'write some pytest tests for this'
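
To double-check which aliases are configured, you can list them:

llm aliases list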

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-xai
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
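
As a rough illustration of what a test can check, here is a minimal smoke-test sketch (not part of the plugin's actual test suite) that assumes the xAI/grok-beta model ID shown above and uses LLM's Python API:

import llm

def test_grok_model_is_registered():
    # llm.get_model() raises llm.UnknownModelError if the plugin
    # failed to register the model with the LLM tool.
    model = llm.get_model("xAI/grok-beta")
    assert model.model_id == "xAI/grok-beta"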

Project details


Release history

This version

0.2

Download files

Download the file for your platform.

Source Distribution

llm_xai-0.2.tar.gz (4.0 kB, source)

Built Distribution

llm_xai-0.2-py3-none-any.whl (4.4 kB, Python 3 wheel)

File details

Details for the file llm_xai-0.2.tar.gz.

File metadata

  • Download URL: llm_xai-0.2.tar.gz
  • Upload date:
  • Size: 4.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for llm_xai-0.2.tar.gz

  • SHA256: 3ff29ac518478994cdd1d731d00276ddace2e78f2ea6e792185a6f846817610f
  • MD5: 2a9b2709e60fc0bad2b07c6c439eddb7
  • BLAKE2b-256: 5979880bd549d97d772fdb58c14637450f41e96740e0e252fa1a1807f097c346


File details

Details for the file llm_xai-0.2-py3-none-any.whl.

File metadata

  • Download URL: llm_xai-0.2-py3-none-any.whl
  • Upload date:
  • Size: 4.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for llm_xai-0.2-py3-none-any.whl

  • SHA256: da10217998467efe8db51e0f2fe4dc11f5e01b17bc2bd2eb9acbc2b4cc97c817
  • MD5: 8b86ccf20bfd19b259758154745960b4
  • BLAKE2b-256: bac824a5d3830906f3e324023cfd9d5e9be09a86083744495165e87bb382144f

