Sopel AI - an LLM enhanced chat bot plug-in
% sopel_ai(1) Version 1.3.1 chatbot plugin
Name
Sopel AI - AI query/response plugin
Synopsis
Enable Sopel to respond to queries using a Perplexity AI back-end, featuring the ability to plug in different LLMs on a per-user basis.
pip install -U sopel_ai
sopel configure
sopel
From a channel where Sopel AI is present enter a query:
.q Summarize the plot of The Martian by Andy Weir.
This plugin requires an API key issued by the service provider.
Installation
pip install -U sopel_ai
The installation assumes that the Sopel chatbot is already installed in the
target system and in the same environment as the pip
installation.
Confirm the installed package and version:
echo "import sopel_ai ; print(sopel_ai.__VERSION__)" | python
Commands
Listed in order of frequency of use:
Command | Arguments | Effect |
---|---|---|
.q | Some question | The model produces a response |
.qpm | Some question | Same as .q but in private message |
.models | n/a | Lists all models that Sopel AI supports |
.mymodel | number | Request or set the model to use for the current /nick |
.req | n/a | Return the GitHub URL for Sopel AI feature requests |
.bug | n/a | Same as .req |
Other available commands if the standard Sopel infobot plugins are enabled:
Command | Arguments | Effect |
---|---|---|
.search | Some question | Search using Bing or DuckDuckGo |
.dict | Word | Get a dictionary definition if one is available |
.tr | Word or phrase | Translate to English |
.w | Word or topic | Search Wikipedia for articles |
Usage
The most common usage is to enter a query in-channel or private message, and wait for the bot to respond.
.q Quote the Three Laws of Robotics as a list and without details.
Users may check the current model used for producing their responses by issuing:
.mymodel
The bot produces a numbered list of supported models in response to:
.models
Users are welcome to change the default model to one of those listed by issuing
the .mymodel command followed by the item number of the desired model from
the list:
.mymodel 1
Users may request private instead of in-channel responses:
.qpm Quote the Three Laws of Robotics and give me examples.
Responses generated by .q queries are expected to be short and are
truncated at 480 characters. They are intended to appear in-channel and to be as
brief as possible.
Responses generated from a .qpm query are expected to be long and detailed,
with a 16 KB length limit. They may span multiple messages (due to IRCv3
limitations), and sopel_ai presents them to the user in a private message
regardless of whether the query was issued from a channel or a direct message.
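The truncation and message-splitting behavior described above can be illustrated with a short sketch. This is illustrative only, not sopel_ai's actual implementation; the function names and the per-message chunk size are assumptions, while the 480-character and 16 KB figures come from the limits described above:

```python
# Illustrative sketch only -- NOT sopel_ai's actual code. Names and the
# per-message chunk size are assumptions; the 480-character and 16 KB
# limits are those documented above.

IN_CHANNEL_LIMIT = 480      # .q responses are truncated at this length
PM_TOTAL_LIMIT = 16 * 1024  # .qpm responses are capped at 16 KB
IRC_CHUNK_SIZE = 400        # conservative per-message payload size (assumed)


def truncate_for_channel(text: str) -> str:
    """Trim a .q response so it fits in a single in-channel message."""
    return text[:IN_CHANNEL_LIMIT]


def split_for_private_message(text: str) -> list[str]:
    """Cap a .qpm response at the total limit, then split it into
    messages short enough to send over IRC."""
    text = text[:PM_TOTAL_LIMIT]
    return [text[i:i + IRC_CHUNK_SIZE]
            for i in range(0, len(text), IRC_CHUNK_SIZE)]
```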
Users can query the bot for plugin and AI provider information using:
.mver
AI providers
The current version uses the Perplexity AI models and API. Future versions may support other providers.
API Key
All AI service providers require an API key for access. The API key is configured via:
sopel configure
Or edit this section in the Sopel configuration file:
[sopel_ai]
.
.
llm_key = pplx-3a45enteryourkeyhere
Docker
Sopel AI is dockerized and available from Docker Hub as pr3d4t0r/sopel_ai. The version tag is the same as the latest version number for Sopel AI.
The examples in this section assume execution from the local file system. Adapt as needed to run in a Kubernetes cluster or other deployment environment.
First time
The Sopel + AI configuration file must be created:
docker run -ti -v ${HOME}/sopel_ai_data:/home/sopel_ai \
pr3d4t0r/sopel_ai:latest \
sopel configure
The API key and other relevant configuration data must be provided at this time.
$HOME/sopel_ai_data
is volume mapped to the container's /home/sopel_ai/.sopel
directory. Ensure that your host user has write permissions on the shared volume.
The pr3d4t0r/sopel_ai:latest
image is used if no version is specified. The
image update policy is left to the sysops and is not automatic.
Once $HOME/sopel_ai_data
exists it's possible to copy the contents of a
different ~/.sopel
directory to it and use it as the configuration and Sopel
AI database store.
Starting Sopel AI
A Docker Compose file is provided as an example of how to start the service, docker-file.yaml. With this Docker Compose file in the current directory, start the service with:
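If the example docker-file.yaml is not at hand, a minimal Docker Compose sketch equivalent to the docker run invocation above might look like the following. The service name, image tag, restart policy, and volume paths are assumptions based on the earlier examples; consult the docker-file.yaml shipped with the project for the authoritative version:

```yaml
# Minimal sketch only; the project's docker-file.yaml is authoritative.
services:
  sopel_ai:
    image: pr3d4t0r/sopel_ai:latest
    volumes:
      - ${HOME}/sopel_ai_data:/home/sopel_ai
    restart: unless-stopped
```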
docker-compose up [-d] sopel_ai
The -d
parameter daemonizes the service. Without it, the service will start
and display its output in the current console.
License
The Sopel AI Sopel plugin, package, documentation, and examples are licensed under the BSD-3 open source license at https://github.com/pr3d4t0r/sopel_ai/blob/master/LICENSE.txt.
See also
- Sopel AI API documentation at https://pr3d4t0r.github.io/sopel_ai
- PerplexiPy, a high-level API for Perplexity AI: https://pypi.org/project/perplexipy
- Sopel commands: https://sopel.chat/usage/commands/
- Sopel bot home page: https://sopel.chat/
Bugs
Feature requests and bug reports: