AI Smart Human Assistant Library
AISHA Lib: A High-Level Abstraction for Building AI Assistants
AISHA (AI Smart Human Assistant) Lib is a high-level abstraction for building AI assistants. It supports multiple large language models (LLMs) and several LLM backends, giving developers a powerful and flexible toolset for assembling assistants quickly.
Environment
To create a conda environment with the project's dependencies, run:
conda env create -f environment.yml
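The environment.yml ships with the project. If you need to write one yourself, a minimal sketch could look like the following (the environment name and Python version here are assumptions, not taken from the project):

name: aisha
channels:
  - defaults
dependencies:
  - python=3.10
  - pip
  - pip:
      - aishalib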
Installation
pip install aishalib
Supported Models
The following models are supported:
- Phi-3-medium-128k-instruct
- c4ai-command-r-v01
LLM backends
The following LLM backends are supported:
- Llama.cpp Server API
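Aisha talks to the backend over plain HTTP. For reference, a raw completion request against a running Llama.cpp server looks roughly like this (a minimal sketch of the backend API itself; the exact payload Aisha sends is internal to the library):

curl http://127.0.0.1:8000/completion \
    -H "Content-Type: application/json" \
    -d '{"prompt": "Hello, how are you?", "n_predict": 64, "temperature": 0.5}'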
Telegram bot example
from aishalib.aishalib import Aisha
from telegram import Update
from telegram.ext import Application, MessageHandler, ContextTypes, filters
TG_TOKEN = "YOUR_TG_TOKEN"
SYSTEM_PROMPT = """
You are a smart assistant bot for chatting on Telegram.
You talk in a group chat with other users.
You answer in Russian.
"""
SYSTEM_INJECTION = """
The last message was written by the user with identifier {user_id}.
Use these identifiers to tell users apart.
Never address a user by their identifier! Use their name only.
If a user has not introduced themselves, ask for their name.
If, judging by the context and meaning of the conversation, the message is addressed to you or is a general message for everyone in the chat, you must answer it.
If the message is addressed to another user, write the special command "ignoring_message" in your reply.
"""
def get_aisha(chat_id, tg_context):
    # Create one Aisha instance per chat and cache it in the bot context.
    if chat_id not in tg_context.user_data:
        aisha = Aisha("http://127.0.0.1:8000/completion",
                      "CohereForAI/c4ai-command-r-v01",
                      prompt=SYSTEM_PROMPT,
                      max_context=8192,
                      max_predict=512)
        tg_context.user_data[chat_id] = aisha
    aisha = tg_context.user_data[chat_id]
    aisha.load_context(chat_id)  # restore any previously saved conversation history
    return aisha
async def process_message(update: Update, context: ContextTypes.DEFAULT_TYPE):
    chat_id = update.effective_chat.id
    user_id = str(update.message.from_user.id)
    aisha = get_aisha(str(chat_id), context)
    aisha.add_user_request(update.message.text,
                           system_injection=SYSTEM_INJECTION.replace("{user_id}", user_id))
    text_response = aisha.completion(temp=0.0, top_p=0.5)
    aisha.save_context(str(chat_id))  # persist history under the same key used to load it
    # The model marks messages meant for other users with "ignoring_message".
    if "ignoring_message" not in text_response:
        await context.bot.send_message(chat_id=chat_id,
                                       text=text_response,
                                       reply_to_message_id=update.message.message_id)
application = Application.builder().token(TG_TOKEN).build()
application.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, process_message))
application.run_polling()
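The example relies on the python-telegram-bot package (version 20 or later, which provides the async Application API):

pip install python-telegram-bot

Replace YOUR_TG_TOKEN with the token issued by BotFather.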
Chainlit example
import chainlit as cl
from aishalib.aishalib import Aisha
@cl.on_chat_start
async def on_chat_start():
    # Create a dedicated Aisha instance for each Chainlit session.
    aisha = Aisha("http://127.0.0.1:8000/completion",
                  "CohereForAI/c4ai-command-r-v01",
                  prompt="You answer in Russian.",
                  max_context=4096,
                  max_predict=512)
    cl.user_session.set("aisha", aisha)

@cl.on_message
async def on_message(input_msg: cl.Message):
    output_msg = cl.Message(content="")
    await output_msg.send()
    aisha = cl.user_session.get("aisha")
    aisha.add_user_request(input_msg.content)
    # completion() blocks, so run it in a worker thread via make_async.
    response = await cl.make_async(aisha.completion)(temp=0.5, top_p=0.5)
    output_msg.content = response
    await output_msg.update()
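Chainlit applications are launched with the chainlit CLI. Assuming the example above is saved as app.py:

chainlit run app.py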
Document search example
from aishalib.aishalib import Aisha
MODEL_ID = "microsoft/Phi-3-medium-128k-instruct"
COMPLETION_URL = "http://127.0.0.1:8000/completion"
with open("document.txt") as f:
    text = f.read()

prompt = f"""You are a search engine.
Below is the text to search over.
## Text:
{text}
"""

aisha = Aisha(COMPLETION_URL, MODEL_ID, prompt=prompt, max_context=40000, max_predict=8192)

user_request = "A question about the document"
prompt = f"Return the relevant fragments of the text that most closely match the user's query: {user_request}."
aisha.add_user_request(prompt)
print(aisha.completion(temp=0.0, top_p=0.0))
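Note that the whole document is embedded in the system prompt, so it has to fit inside the max_context budget (40000 here; the 128k context of Phi-3-medium-128k-instruct leaves ample room). Sampling with temp=0.0 and top_p=0.0 makes the retrieval deterministic, so repeated queries over the same document return the same fragments.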
Run the Llama.cpp server backend
llama.cpp/build/bin/server -m model_q5_k_m.gguf -ngl 99 -fa -c 4096 --host 0.0.0.0 --port 8000
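Here -m selects the GGUF model file, -ngl 99 offloads up to 99 layers to the GPU, -fa enables flash attention, and -c 4096 sets the context window; --port 8000 matches the completion URL used in the examples above. Recent llama.cpp server builds expose a health endpoint, so you can check the server is up before pointing Aisha at it:

curl http://127.0.0.1:8000/health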
Install CUDA toolkit for Llama.cpp compilation
Note that the toolkit version must match the driver version; the driver version can be found with the nvidia-smi command. For example, to install the toolkit for CUDA 12.2, run the following commands:
CUDA_TOOLKIT_VERSION=12-2
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt update
sudo apt -y install cuda-toolkit-${CUDA_TOOLKIT_VERSION}
echo -e '
export CUDA_HOME=/usr/local/cuda
export PATH=${CUDA_HOME}/bin:${PATH}
export LD_LIBRARY_PATH=${CUDA_HOME}/lib64:$LD_LIBRARY_PATH
' >> ~/.bashrc
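With the toolkit on the PATH (open a new shell or run source ~/.bashrc so the exports take effect), llama.cpp can be rebuilt with CUDA support. A minimal sketch; the exact option name depends on the llama.cpp revision (older trees use -DLLAMA_CUBLAS=ON, newer ones -DGGML_CUDA=ON):

cd llama.cpp
cmake -B build -DGGML_CUDA=ON
cmake --build build --config Release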
Download files
File details
Details for the file aishalib-0.0.6.tar.gz
File metadata
- Download URL: aishalib-0.0.6.tar.gz
- Upload date:
- Size: 6.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 93cf4c13a5fb352b9f4da8fb3cb03c228d4c183612427143632e9f0e8def9682
MD5 | 1f44811660bd657b32352d0449667416
BLAKE2b-256 | fab76faf117a59fe8003db6fd43ef541be0d3c6bed18bbed7c1a149f598eda30
File details
Details for the file aishalib-0.0.6-py3-none-any.whl
File metadata
- Download URL: aishalib-0.0.6-py3-none-any.whl
- Upload date:
- Size: 6.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4e273400fb1dd24b4cfb6f09666ea28a5f70943266646f9115c9a625b19678ec
MD5 | b063c057a70f0e4d2475f7b4ba077bde
BLAKE2b-256 | 88508655ada8c9510178f4c3115147b11c71d7f2806668612f1c7bab7dea480c