
LLM-based Chatbot


Installation

Install the Chatbot Python package with one of the following commands:

  1. From PyPI
pip install llmchatbot
  2. From the GitHub repository
pip install git+https://github.com/egpivo/llmchatbot.git
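
To confirm the installation, you can check that the package imports cleanly; this assumes the top-level module shares the package name llmchatbot:

python -c "import llmchatbot"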

Serving Automation

This repository automates checking for and fine-tuning the pre-trained models used by the Chatbot application. The automation script lets you customize the SpeechT5 and Whisper models and retrain them if needed.

Serving Process Flow

graph TD
  A[Check if Model Exists]
  B[Fine-Tune Model]
  C[Load BentoML Configuration]
  D[Serve the App]
  E[Check SSL Certificates]
  F[Generate Dummy SSL Certificates]

  A -- Yes --> C
  A -- No --> B
  B --> C
  C --> D
  D --> E
  E -- No --> F
  E -- Yes --> D
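
Read linearly, the flow amounts to the shell sketch below. The model directory, certificate parameters, and the BentoML service target are assumptions for illustration only; scripts/run_app_service.sh in the repository is the authoritative implementation.

# Illustrative sketch of the serving flow; paths and the BentoML target are assumed.
MODEL_DIR="artifacts/models"   # assumed location of the fine-tuned models
if [ ! -d "$MODEL_DIR" ]; then
  echo "Model not found - fine-tuning the pre-trained models first"
  # (the fine-tuning step would run here)
fi

# Create dummy SSL certificates when none exist in artifacts/.
if [ ! -f artifacts/key.pem ] || [ ! -f artifacts/cert.pem ]; then
  openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout artifacts/key.pem -out artifacts/cert.pem -subj "/CN=localhost"
fi

# Load the BentoML configuration and serve the app (service target name is assumed).
bentoml serve chatbot_service:svc --port 443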

Artifact Folder

During model serving, the artifacts folder stores the BentoML artifacts that are essential for serving the Chatbot application.
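
After a run with the default dummy certificates, you should at least see the SSL files there; the remaining contents depend on the models being packaged and are not listed here:

ls artifacts/
# key.pem  cert.pem  ... plus the BentoML model artifacts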

Usage

Local Model Serving

Default Model Values

Run the Chatbot service with default model values:

make local-serve

Customizing the Serving Process

Customize the Chatbot serving process using the automation script. Specify your desired models and options:

bash scripts/run_app_service.sh \
  --t5_pretrained_model {replace_with_actual_t5_model} \
  --t5_pretrained_vocoder {replace_with_actual_t5_vocoder} \
  --whisper_pretrained_model {replace_with_actual_whisper_model} \
  --is_retraining
  • Note: Replace {replace_with_actual_t5_model}, {replace_with_actual_t5_vocoder}, and {replace_with_actual_whisper_model} with your preferred values. Adding the --is_retraining flag forces model retraining.
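
For example, using publicly available Hugging Face checkpoints (shown purely as an illustration; substitute whichever models suit your deployment):

bash scripts/run_app_service.sh \
  --t5_pretrained_model microsoft/speecht5_tts \
  --t5_pretrained_vocoder microsoft/speecht5_hifigan \
  --whisper_pretrained_model openai/whisper-small \
  --is_retraining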

Model Serving via Docker

By Makefile:

make docker-serve

By Docker CLI:

  • DockerHub
    docker run -p 443:443 egpivo/chatbot:latest
    
  • GitHub Package
    docker run -p 443:443 ghcr.io/egpivo/llmchatbot:latest
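
The container listens for HTTPS on port 443, so you can remap it to any free host port; if you want to supply your own certificates, a bind mount works, but the in-container artifacts path shown in the comment is an assumption to verify against the image:

docker run -p 8443:443 egpivo/chatbot:latest
# Optional, with an assumed container path for the certificates:
# docker run -p 8443:443 -v "$(pwd)/artifacts:/app/artifacts" egpivo/chatbot:latest

With the remapped port, the client URL in the next section becomes https://{ip}:8443/chatbot.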
    

Client Side

Access the demo chatbot at https://{ip}/chatbot, where {ip} defaults to 0.0.0.0.

  • Note: Dummy SSL certificates and keys are generated by default for secure communication if key.pem and cert.pem do not exist in artifacts/. You can also replace them with your own certificate and key.
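
Because the dummy certificates are self-signed, a quick smoke test from the command line has to skip certificate verification; replace localhost with your server's address if you are not testing locally:

curl -k https://localhost/chatbot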

Demo

  • Explore the demo site hosted on Alibaba Cloud at https://egpivo.com/chatbot/.

  • Note: This site is intended for demonstration purposes only; computing performance is not guaranteed.

