Sopel AI - an LLM-enhanced chatbot plugin

% sopel_ai(1) Version 1.3.1 chatbot plugin

Name

Sopel AI - AI query/response plugin

Synopsis

Enable Sopel to respond to queries using a Perplexity AI back-end, featuring the ability to plug different LLMs on a per-user basis.

pip install -U sopel_ai

sopel configure
sopel

From a channel where Sopel AI is present, enter a query:

.q Summarize the plot of The Martian by Andy Weir.

This plugin requires an API key issued by the service provider.

Installation

pip install -U sopel_ai

The installation assumes that the Sopel chatbot is already installed on the target system, in the same environment as the pip installation.

Confirm the installed package and version:

echo "import sopel_ai ; print(sopel_ai.__VERSION__)" | python

Commands

Listed in order of frequency of use:

| Command | Arguments | Effect |
|---------|-----------|--------|
| .q | Some question | The model produces a response |
| .qpm | Some question | Same as .q but in private message |
| .models | n/a | Lists all models that Sopel AI supports |
| .mymodel | number | Request or set the model to use for the current /nick |
| .req | n/a | Return the GitHub URL for Sopel AI feature requests |
| .bug | n/a | Same as .req |

Other available commands if the standard Sopel infobot plugins are enabled:

| Command | Arguments | Effect |
|---------|-----------|--------|
| .search | Some question | Search using Bing or DuckDuckGo |
| .dict | Word | Get a dictionary definition if one is available |
| .tr | Word or phrase | Translate to English |
| .w | Word or topic | Search Wikipedia for articles |
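
For reference, dot-commands like the ones above are wired through Sopel's plugin API. A minimal sketch of a hypothetical .q-style handler, not Sopel AI's actual implementation; the _askLLM helper is a made-up stand-in for the Perplexity call:

```python
from sopel import plugin


def _askLLM(question: str, nick: str) -> str:
    # Hypothetical stand-in for the Perplexity back-end call.
    return f'({nick}) answer to: {question}'


@plugin.commands('q')
def query(bot, trigger):
    """Send the user's question to the LLM back-end and reply in-channel."""
    question = trigger.group(2)          # everything after ".q"
    if not question:
        bot.reply('Please include a question.')
        return
    # In-channel replies stay short - see the length limits under Usage.
    bot.say(_askLLM(question, trigger.nick)[:480])
```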

Usage

The most common usage is to enter a query in-channel or private message, and wait for the bot to respond.

.q Quote the Three Laws of Robotics as a list and without details.

Users may check the current model used for producing their responses by issuing:

.mymodel

The bot produces a numbered list of supported models when a user issues:

.models

Users are welcome to change the default model to one of those listed by issuing the .mymodel command followed by the item number for the desired model from the list:

.mymodel 1

Users may request private instead of in-channel responses:

.qpm Quote the Three Laws of Robotics and give me examples.

Responses generated by .q queries are expected to be short and are truncated at 480 characters. They are intended to appear in-channel and to be as brief as possible.

Responses generated from a .qpm query are expected to be long and detailed, with a 16 KB length limit. They may span multiple messages (due to IRCv3 limitations), and sopel_ai presents them to the user in a private message, regardless of whether the query was issued from a channel or a direct message.
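
The splitting itself can be done with the standard library. A minimal sketch of a hypothetical helper, not the plugin's actual code; the limits mirror the figures above:

```python
import textwrap

IRC_SAFE_LENGTH = 480   # per-message limit, the same figure used for .q replies
PM_LIMIT = 16 * 1024    # 16 KB cap on .qpm responses

def splitForIRC(response: str) -> list[str]:
    # Trim to the private-message cap, then wrap into chunks short
    # enough to send as individual IRC messages.
    return textwrap.wrap(response[:PM_LIMIT], width=IRC_SAFE_LENGTH)
```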

Users can query the bot plugin version and AI provider using:

.mver

AI providers

The current version uses the Perplexity AI models and API. Future versions may support other providers.
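
Perplexity's API is OpenAI-compatible, so a raw call to the back-end looks roughly like this. A minimal sketch using the openai client; the model name and the plugin's actual client code are assumptions:

```python
from openai import OpenAI

client = OpenAI(
    api_key='pplx-...',                     # the key from the [sopel_ai] config
    base_url='https://api.perplexity.ai',   # Perplexity's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model='sonar',                          # model name is an assumption
    messages=[{'role': 'user', 'content': 'Quote the Three Laws of Robotics.'}],
)
print(response.choices[0].message.content)
```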

API Key

All AI service providers require an API key for access. The API key is configured via:

sopel configure

Or edit this section in the Sopel configuration file:

[sopel_ai]
.
.
llm_key = pplx-3a45enteryourkeyhere
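
Inside a plugin, Sopel configuration sections are normally declared and registered through the config API. A minimal sketch of how the [sopel_ai] section might be defined, illustrative rather than Sopel AI's actual definition:

```python
from sopel.config.types import StaticSection, ValidatedAttribute

class SopelAISection(StaticSection):
    llm_key = ValidatedAttribute('llm_key', str)   # the provider API key

def setup(bot):
    # Register the section so bot.config.sopel_ai.llm_key is available.
    bot.config.define_section('sopel_ai', SopelAISection)
```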

Docker

Sopel AI is dockerized and available from Docker Hub as pr3d4t0r/sopel_ai. The version tag is the same as the latest version number for Sopel AI.

The examples in this section assume execution from the local file system. Adapt as needed to run in a Kubernetes cluster or other deployment method.

First time

The Sopel + AI configuration file must be created:

docker run -ti -v ${HOME}/sopel_ai_data:/home/sopel_ai \
    pr3d4t0r/sopel_ai:latest \
    sopel configure

The API key and other relevant configuration data must be provided at this time. $HOME/sopel_ai_data is volume-mapped to the container's /home/sopel_ai/.sopel directory. Ensure that your host user has write permissions on the shared volume.

The pr3d4t0r/sopel_ai:latest image is used if no version is specified. The image update policy is left to the sysops and is not automatic.
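
To pin a specific release instead of latest, pull the image by its version tag, which matches the Sopel AI release number:

docker pull pr3d4t0r/sopel_ai:1.3.1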

Once $HOME/sopel_ai_data exists, it's possible to copy the contents of a different ~/.sopel directory to it and use it as the configuration and Sopel AI database store.
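
For example, assuming an existing Sopel configuration in ~/.sopel:

cp -a ${HOME}/.sopel/. ${HOME}/sopel_ai_data/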

Starting Sopel AI

An example Docker Compose file, docker-file.yaml, shows how to start the service. With this Docker Compose file in the current directory, start the service with:

docker-compose up [-d] sopel_ai

The -d parameter daemonizes the service. Without it, the service will start and display its output in the current console.

License

The Sopel AI Sopel plugin, package, documentation, and examples are licensed under the BSD-3 open source license at https://github.com/pr3d4t0r/sopel_ai/blob/master/LICENSE.txt.

See also

Bugs

Feature requests and bug reports:

https://github.com/pr3d4t0r/sopel_ai/issues
