AI Smart Human Assistant Library

Project description

AISHA Lib: A High-Level Abstraction for Building AI Assistants

In the evolving landscape of artificial intelligence, the development of smart assistants has become increasingly prevalent. To streamline this process, the AISHA (AI Smart Human Assistant) Lib offers a high-level abstraction designed for creating AI assistants. This versatile library supports various large language models (LLMs) and different LLM backends, providing developers with a powerful and flexible toolset.

Environment

To create a conda environment with the required dependencies, run:

conda env create -f environment.yml

Installation

pip install aishalib

Supported Models

The following LLMs are supported:

  • microsoft/Phi-3-medium-128k-instruct
  • CohereForAI/c4ai-command-r-v01
  • google/gemma-2-27b-it
  • Qwen/Qwen2-72B-Instruct

LLM backends

The following LLM backends are supported:

  • Llama.cpp Server API
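As a minimal sketch of how a backend and model fit together, the snippet below uses only the calls that appear in the Telegram example that follows (`LlamaCppBackend`, `Aisha`, `add_user_request`, `completion`, `parseToolResponse`). It assumes a llama.cpp server is already listening on `http://127.0.0.1:8088` and that `system_prompt_example.txt` exists:

```python
from aishalib.aishalib import Aisha
from aishalib.llmbackend import LlamaCppBackend
from aishalib.tools import parseToolResponse

# Point the backend at a running llama.cpp server's /completion endpoint.
backend = LlamaCppBackend("http://127.0.0.1:8088/completion", max_predict=256)
aisha = Aisha(backend, "google/gemma-2-27b-it",
              prompt_file="system_prompt_example.txt", max_context=8192)

# Send one user message and read the model's tool response.
aisha.add_user_request("id_1: Hello!")
tools_response = aisha.completion(temp=0.7, top_p=0.9)
tools = parseToolResponse(tools_response, ["directly_answer", "save_human_name", "pass"])
if "directly_answer" in tools:
    print(tools["directly_answer"])
```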

Telegram bot example

import os

from aishalib.aishalib import Aisha
from aishalib.llmbackend import LlamaCppBackend
from aishalib.tools import parseToolResponse
from aishalib.utils import get_time_string
from aishalib.memory import SimpleMemory
from telegram import Update
from telegram.ext import Application, MessageHandler, ContextTypes, filters


BOT_NAME = os.environ['BOT_NAME']
TG_TOKEN = os.environ['TG_TOKEN']

PERSISTENCE_DIR = BOT_NAME + "/"

if not os.path.exists(PERSISTENCE_DIR):
    os.makedirs(PERSISTENCE_DIR)

memory = SimpleMemory(PERSISTENCE_DIR + "memory.json")


def get_aisha(aisha_context_key, tg_context):
    # Create one Aisha instance per chat on first use and cache it in the
    # Telegram user data; the conversation context is persisted per chat.
    if aisha_context_key not in tg_context.user_data:
        backend = LlamaCppBackend("http://127.0.0.1:8088/completion", max_predict=256)
        aisha = Aisha(backend, "google/gemma-2-27b-it", prompt_file="system_prompt_example.txt", max_context=8192)
        tg_context.user_data[aisha_context_key] = aisha
    aisha = tg_context.user_data[aisha_context_key]
    aisha.load_context(aisha_context_key)
    return aisha


async def process_message(update: Update, context: ContextTypes.DEFAULT_TYPE):
    chat_id = update.effective_chat.id
    user_id = str(update.message.from_user.id)
    user_name = memory.get_memory_value("names:" + user_id, "")
    computed_name = user_name if user_name else f"id_{user_id}"
    message = update.message.text

    aisha = get_aisha(PERSISTENCE_DIR + str(chat_id), context)
    aisha.add_user_request(f"{computed_name}: {message}", meta_info=get_time_string())
    tools_response = aisha.completion(temp=0.7, top_p=0.9)
    aisha.save_context(PERSISTENCE_DIR + str(chat_id))

    # Map the raw completion onto tool calls; the model is expected to
    # invoke at least one of these tools.
    tools = parseToolResponse(tools_response, ["directly_answer", "save_human_name", "pass"])

    if "save_human_name" in tools:
        # The tool value is expected in the form "id_<user_id>:<name>";
        # split only on the first ":" so names containing ":" survive.
        user_name = tools["save_human_name"]
        user_id_part, name_part = user_name.split(":", 1)
        memory.save_memory_value("names:" + user_id_part.replace("id_", ""), name_part)

    if "pass" not in tools:
        await context.bot.send_message(chat_id=chat_id,
                                       text=tools["directly_answer"],
                                       reply_to_message_id=update.message.message_id)


application = Application.builder().token(TG_TOKEN).build()
application.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, process_message))
application.run_polling()
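The example reads `BOT_NAME` and `TG_TOKEN` from the environment; `BOT_NAME` also names the persistence directory. Assuming the script above is saved as `bot.py` (a hypothetical file name), it can be started like this:

```shell
export BOT_NAME=my_aisha_bot              # also used as the persistence directory
export TG_TOKEN=<your-telegram-bot-token> # obtained from @BotFather
python bot.py
```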

Run the Llama.cpp server backend

The port and context size passed to llama-server must match the values used when constructing LlamaCppBackend and Aisha (8088 and 8192 in the example above):

llama.cpp/build/bin/llama-server -m model_q5_k_m.gguf -ngl 99 -fa -c 8192 --host 0.0.0.0 --port 8088
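Before starting the bot you can check that the server is reachable. `/completion` is the llama.cpp server endpoint that `LlamaCppBackend` points to; replace the port below with whatever you passed via `--port`:

```shell
# Ask the llama.cpp server for a short completion as a sanity check.
curl -s http://127.0.0.1:8088/completion \
     -H "Content-Type: application/json" \
     -d '{"prompt": "Hello", "n_predict": 8}'
```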

Install CUDA toolkit for Llama.cpp compilation

Please note that the toolkit version must match the driver version; the driver version can be found with the nvidia-smi command. To install the toolkit for CUDA 12.5, run the following commands:

CUDA_TOOLKIT_VERSION=12-5
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt update
sudo apt -y install cuda-toolkit-${CUDA_TOOLKIT_VERSION}
echo -e '
export CUDA_HOME=/usr/local/cuda
export PATH=${CUDA_HOME}/bin:${PATH}
export LD_LIBRARY_PATH=${CUDA_HOME}/lib64:$LD_LIBRARY_PATH
' >> ~/.bashrc
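After installing the toolkit you can verify it and build llama.cpp with CUDA support. The `-DGGML_CUDA=ON` switch is the current llama.cpp CMake flag for CUDA offloading (older trees used `LLAMA_CUBLAS`); adjust if your checkout differs:

```shell
source ~/.bashrc
nvcc --version   # should report the installed toolkit release, e.g. 12.5

# Build llama.cpp with CUDA offloading enabled
cmake -B llama.cpp/build llama.cpp -DGGML_CUDA=ON
cmake --build llama.cpp/build --config Release -j
```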

