AI Smart Human Assistant Library

AISHA Lib: A High-Level Abstraction for Building AI Assistants

AISHA (AI Smart Human Assistant) Lib is a high-level abstraction for building AI assistants. The library supports several large language models (LLMs) and multiple LLM backends, giving developers a flexible toolset for assistant development.

Environment

To create the Conda environment, run:

conda env create -f environment.yml

Installation

pip install aishalib
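
A minimal quick-start sketch (assuming a Llama.cpp server is already running on http://127.0.0.1:8000; the constructor arguments follow the examples below):

from aishalib.aishalib import Aisha

# Point Aisha at the Llama.cpp server completion endpoint and choose a supported model.
aisha = Aisha("http://127.0.0.1:8000/completion",
              "CohereForAI/c4ai-command-r-v01",
              prompt="You are a helpful assistant.",
              max_context=4096,
              max_predict=512)

# Add a user message and request a completion.
aisha.add_user_request("Hello! What can you do?")
print(aisha.completion(temp=0.5, top_p=0.5))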

Supported Models

The following LLM models are supported:

  • Phi-3-medium-128k-instruct
  • c4ai-command-r-v01

LLM backends

The following LLM backends are supported:

  • Llama.cpp Server API
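
The Llama.cpp server exposes its API over HTTP. As a quick sanity check that the backend is reachable before pointing aishalib at it, you can POST directly to its /completion endpoint (a sketch using the requests package, not part of aishalib):

import requests

# Minimal request to the Llama.cpp server /completion endpoint.
resp = requests.post("http://127.0.0.1:8000/completion",
                     json={"prompt": "Hello", "n_predict": 16})
resp.raise_for_status()
print(resp.json()["content"])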

Telegram bot example

from aishalib.aishalib import Aisha
from telegram import Update
from telegram.ext import Application, MessageHandler, ContextTypes, filters

TG_TOKEN = "YOUR_TG_TOKEN"

SYSTEM_PROMPT = """
You are a smart assistant bot for chatting on Telegram.
You are talking in a group chat with other users.
You reply in Russian.
"""

SYSTEM_INJECTION = """
The last message was written by the user with identifier {user_id}.
Use these identifiers to tell users apart.
You must not address a user by their identifier! Only by name.
If a user has not introduced themselves, ask for their name.
If, judging by the context and meaning of the conversation, the message is addressed to you or is a general message for everyone in the chat, you must reply to it.
If the message is addressed to another user, write the special command "ignoring_message" in your reply.
"""

def get_aisha(chat_id, tg_context):
    if chat_id not in tg_context.user_data:
        aisha = Aisha("http://127.0.0.1:8000/completion",
                      "CohereForAI/c4ai-command-r-v01",
                      prompt=SYSTEM_PROMPT,
                      max_context=8192,
                      max_predict=512)
        tg_context.user_data[chat_id] = aisha
    aisha = tg_context.user_data[chat_id]
    aisha.load_context(chat_id)
    return aisha

async def process_message(update: Update, context: ContextTypes.DEFAULT_TYPE):
    chat_id = update.effective_chat.id
    user_id = str(update.message.from_user.id)
    aisha = get_aisha(str(chat_id), context)
    aisha.add_user_request(update.message.text,
                           system_injection=SYSTEM_INJECTION.replace("{user_id}", user_id))
    text_response = aisha.completion(temp=0.0, top_p=0.5)
    aisha.save_context(str(chat_id))  # use the same string key as load_context
    if "ignoring_message" not in text_response:
        await context.bot.send_message(chat_id=chat_id,
                                       text=text_response,
                                       reply_to_message_id=update.message.message_id)

application = Application.builder().token(TG_TOKEN).build()
application.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, process_message))
application.run_polling()
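
Assuming the script above is saved as bot.py and a Llama.cpp server is listening on port 8000, start the bot with:

python bot.py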

Chainlit example

import chainlit as cl
from aishalib.aishalib import Aisha

@cl.on_chat_start
async def on_chat_start():
    aisha = Aisha("http://127.0.0.1:8000/completion",
                  "CohereForAI/c4ai-command-r-v01",
                  prompt="Ты отвечаешь на русском языке.",
                  max_context=4096,
                  max_predict=512)
    cl.user_session.set("aisha", aisha)

@cl.on_message
async def on_message(input_msg: cl.Message):
    output_msg = cl.Message(content="")
    await output_msg.send()
    aisha = cl.user_session.get("aisha")
    aisha.add_user_request(input_msg.content)
    response = await cl.make_async(aisha.completion)(temp=0.5, top_p=0.5)
    output_msg.content = response
    await output_msg.update()
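
Assuming the snippet above is saved as app.py, start the Chainlit UI with:

chainlit run app.py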

Document search example

from aishalib.aishalib import Aisha

MODEL_ID = "microsoft/Phi-3-medium-128k-instruct"
COMPLETION_URL = "http://127.0.0.1:8000/completion"

with open("document.txt") as f:
    text = f.read()

prompt = f"""Ты - поисковая система.
Ниже представлен следующий текст по которому необходимо выполнять поиск.
## Текст:
{text}
"""

aisha = Aisha(COMPLETION_URL, MODEL_ID, prompt=prompt, max_context=40000, max_predict=8192)
user_request = "A question about the document"
prompt = f"Return the relevant text fragments that best match the user's query: {user_request}."
aisha.add_user_request(prompt)
print(aisha.completion(temp=0.0, top_p=0.0))

Run the Llama.cpp server backend

llama.cpp/build/bin/server -m model_q5_k_m.gguf -ngl 99 -fa -c 4096 --host 0.0.0.0 --port 8000
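
Here -m selects the GGUF model file, -ngl 99 offloads up to 99 layers to the GPU, -fa enables flash attention, -c 4096 sets the context size in tokens, and --host/--port control where the HTTP API listens. Flag names may differ between llama.cpp versions.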

Install CUDA toolkit for Llama.cpp compilation

Please note that the toolkit version must match the driver version; you can check the driver version with the nvidia-smi command. For example, to install the toolkit for CUDA 12.2, run the following commands:

CUDA_TOOLKIT_VERSION=12-2
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt update
sudo apt -y install cuda-toolkit-${CUDA_TOOLKIT_VERSION}
echo -e '
export CUDA_HOME=/usr/local/cuda
export PATH=${CUDA_HOME}/bin:${PATH}
export LD_LIBRARY_PATH=${CUDA_HOME}/lib64:$LD_LIBRARY_PATH
' >> ~/.bashrc
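
With the toolkit installed, llama.cpp can be built with CUDA support. The CMake option has changed across llama.cpp versions (recent builds use GGML_CUDA, older ones LLAMA_CUDA or LLAMA_CUBLAS), so treat this as a sketch:

cd llama.cpp
cmake -B build -DGGML_CUDA=ON
cmake --build build --config Release -j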
