Message-based LLM tools.

Project description

Chatstack

Minimalist Context Management for message-based GPTs

This Python package provides a chatbot implementation with context management for OpenAI's GPT-3.5-turbo and GPT-4 chat models. The chatbot maintains a conversation history and trims the input context so that the system prompt and the most recent messages fit within the model's token limit, while reserving room for the response.
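The core idea behind this kind of context management is to fit the system prompt and the most recent messages into a fixed token budget while reserving room for the completion. A rough sketch of that idea (illustrative only; the package itself counts tokens exactly with tiktoken, and count_tokens below is a crude whitespace stand-in):

```python
# Illustrative sketch of token-budget context trimming.
# NOTE: count_tokens is a crude whitespace approximation; the real
# package uses tiktoken for exact token counts.

def count_tokens(text: str) -> int:
    return len(text.split())

def build_context(system_msg: str, history: list[str],
                  max_tokens: int = 4096,
                  reserved_for_response: int = 500) -> list[str]:
    """Keep the system message plus as many recent messages as fit."""
    budget = max_tokens - reserved_for_response - count_tokens(system_msg)
    kept: list[str] = []
    # Walk the history newest-first, stopping when the budget runs out.
    for msg in reversed(history):
        cost = count_tokens(msg)
        if cost > budget:
            break
        kept.append(msg)
        budget -= cost
    # Restore chronological order, with the system message first.
    return [system_msg] + list(reversed(kept))
```

Older messages are dropped first, so the model always sees the system prompt and the freshest part of the conversation.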

Dependencies

  • loguru
  • pydantic
  • openai
  • tiktoken

Classes

  • ChatRoleMessage: A base class for messages with role, text, and tokens.
  • SystemMessage: A message with the role 'system'.
  • ContextMessage: A message inserted into the model's input context to supply supplementary information.
  • AssistantMessage: A message with the role 'assistant'.
  • UserMessage: A message with the role 'user'.
  • ChatContext: A class that manages the conversation context and generates responses using OpenAI message interface models.
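The message classes above share a common shape: a role, the message text, and a token count. A simplified sketch of that hierarchy, using dataclasses here for self-containment (the package itself builds these on pydantic, and the field names are assumptions based on the descriptions above):

```python
from dataclasses import dataclass

# Simplified sketch of the message hierarchy; the actual package
# uses pydantic models, and exact field names are assumptions.

@dataclass
class ChatRoleMessage:
    text: str
    tokens: int = 0
    role: str = ""

    def as_dict(self) -> dict:
        # Shape expected by the OpenAI chat completions API.
        return {"role": self.role, "content": self.text}

@dataclass
class SystemMessage(ChatRoleMessage):
    role: str = "system"

@dataclass
class ContextMessage(ChatRoleMessage):
    # The role used for injected context is an assumption here.
    role: str = "user"

@dataclass
class AssistantMessage(ChatRoleMessage):
    role: str = "assistant"

@dataclass
class UserMessage(ChatRoleMessage):
    role: str = "user"
```

Each subclass pins its role, so ChatContext only needs to collect messages and serialize them with as_dict() when calling the API.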

Usage

  1. Import the ChatContext class.
  2. Create an instance of the ChatContext class with the desired configuration.
  3. Call the user_message method with the user's message text to get a response from the chatbot.

Example:

from chatstack import ChatContext

BASE_SYSTEM_PROMPT  = "You are a clever bot. Do not apologize or make excuses. "
BASE_SYSTEM_PROMPT += "Do not mention that you are an AI language model, since that is annoying to users."

def main():
    chat_context = ChatContext(base_system_msg_text=BASE_SYSTEM_PROMPT)

    print("Welcome to the Chatbot!")
    
    while True:
        user_input = input("You: ")        
        response = chat_context.user_message(user_input, stream=True)
        print("Chatbot:", response)

if __name__ == "__main__":
    main()

Configuration

The ChatContext class accepts the following parameters:

  • min_response_tokens: Minimum number of tokens to reserve for model completion response.
  • max_response_tokens: Maximum number of tokens to allow for model completion response.
  • max_context_assistant_messages: Number of recent assistant messages to keep in context.
  • max_context_user_messages: Number of recent user messages to keep in context.
  • model: The name of the GPT model to use (default: "gpt-3.5-turbo").
  • temperature: The temperature for the model's response generation.
  • base_system_msg_text: The base system message text to provide context for the model.
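Taken together, these parameters describe how the context is assembled: only the most recent N assistant and N user messages are kept, and token space is reserved for the completion. A hedged sketch of the per-role capping step (the semantics are inferred from the parameter names above, and the dict-based message format is an assumption for illustration):

```python
# Sketch of per-role message capping, as suggested by
# max_context_assistant_messages / max_context_user_messages.
# The dict-based message format is an assumption for illustration.

def cap_by_role(history: list[dict],
                max_assistant: int = 4,
                max_user: int = 8) -> list[dict]:
    """Keep only the most recent N assistant and N user messages, preserving order."""
    kept = []
    counts = {"assistant": 0, "user": 0}
    limits = {"assistant": max_assistant, "user": max_user}
    # Scan newest-first so the most recent messages win.
    for msg in reversed(history):
        role = msg["role"]
        if role in limits:
            if counts[role] >= limits[role]:
                continue
            counts[role] += 1
        kept.append(msg)
    return list(reversed(kept))
```

Messages with other roles (such as the system prompt) pass through untouched, so raising either cap only widens the visible history for that role.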


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

chatstack-0.0.3.tar.gz (8.5 kB)

Built Distribution

chatstack-0.0.3-py3-none-any.whl (9.3 kB)

File details

Details for the file chatstack-0.0.3.tar.gz.

File metadata

  • Download URL: chatstack-0.0.3.tar.gz
  • Upload date:
  • Size: 8.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.16

File hashes

Hashes for chatstack-0.0.3.tar.gz

  • SHA256: 11047963d18496cdc5831baed4750e563d05a94ee5843a69d11a1824ed185643
  • MD5: 6fefdf39537bc83cbf04d592ec370eb2
  • BLAKE2b-256: 8cb4270a1ea1e8b95a96647d1a0a95375e43dcb82a4340102763d0d34ed0d77f

File details

Details for the file chatstack-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: chatstack-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 9.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.16

File hashes

Hashes for chatstack-0.0.3-py3-none-any.whl

  • SHA256: e01892effc42a6100c89dafec8bf248ef900c559f2d2a342f7574dce375be416
  • MD5: 47178e51982cb7bbcdaeefb02f7f2177
  • BLAKE2b-256: e436de62b07e0fbf9eeb60a3001fe66352ed79e5fd5305087b7993103c8b488c
