
LLM Dialog Manager

A Python package for managing AI chat conversation history with support for multiple LLM providers (OpenAI, Anthropic, Google, X.AI) and convenient conversation management features.

Features

  • Support for multiple AI providers:
    • OpenAI (GPT-3.5, GPT-4)
    • Anthropic (Claude)
    • Google (Gemini)
    • X.AI (Grok)
  • Intelligent message role management (system, user, assistant)
  • Conversation history tracking and validation
  • Load balancing across multiple API keys
  • Error handling and retry mechanisms
  • Conversation saving and loading
  • Memory management options
  • Conversation search and indexing
  • Rich conversation display options
  • Vision and JSON output support (added 2024-01-11)
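One feature above is load balancing across multiple numbered API keys (OPENAI_API_KEY_1, OPENAI_API_KEY_2, ...). As an illustrative sketch only, not the package's actual implementation, round-robin key selection over such environment variables might look like this (the helpers `collect_keys` and `pool` are hypothetical names):

```python
# Sketch: round-robin over numbered API keys in the environment.
# This mirrors the "load balancing across multiple API keys" idea;
# llm_dialog_manager's real mechanism may differ.
import itertools
import os

def collect_keys(prefix: str) -> list[str]:
    """Gather env vars named <prefix>_1, <prefix>_2, ... until one is missing."""
    keys = []
    i = 1
    while (key := os.environ.get(f"{prefix}_{i}")) is not None:
        keys.append(key)
        i += 1
    return keys

# Demo values standing in for real keys
os.environ["OPENAI_API_KEY_1"] = "key-a"
os.environ["OPENAI_API_KEY_2"] = "key-b"

# itertools.cycle yields the keys in order, forever
pool = itertools.cycle(collect_keys("OPENAI_API_KEY"))
print(next(pool), next(pool), next(pool))  # key-a key-b key-a
```

Each request would draw the next key from the pool, spreading traffic (and rate limits) across accounts.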

Installation

pip install llm-dialog-manager

Quick Start

Environment Variables

Create a .env file in your project root:

# OpenAI
OPENAI_API_KEY_1=your-key-1
OPENAI_API_BASE_1=https://api.openai.com/v1

# Anthropic
ANTHROPIC_API_KEY_1=your-anthropic-key
ANTHROPIC_API_BASE_1=https://api.anthropic.com

# Google
GEMINI_API_KEY=your-gemini-key

# X.AI
XAI_API_KEY=your-x-key
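The .env file above is plain KEY=value lines. In practice a library such as python-dotenv (load_dotenv()) is the usual way to pull it into the process environment; the stdlib-only sketch below (the `load_env` helper is a hypothetical name, not part of this package) just shows what that amounts to:

```python
# Stdlib-only sketch of loading a simple .env file into os.environ.
import os

def load_env(path: str = ".env") -> None:
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blanks, comments, and lines without an '='
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: never override variables already set in the shell
            os.environ.setdefault(key.strip(), value.strip())

# Demo: write a minimal .env and load it
with open(".env", "w") as fh:
    fh.write("# OpenAI\nOPENAI_API_KEY_1=your-key-1\n")
load_env()
print(os.environ["OPENAI_API_KEY_1"])  # your-key-1
```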

Basic Usage

from llm_dialog_manager import Agent

# Initialize an agent with a specific model
agent = Agent("ep-20250319212209-j6tfj-openai", memory_enabled=True)

# Add messages and generate responses
agent.add_message("system", "You are a helpful assistant")
agent.add_message("user", "What is the capital of France?")
response = agent.generate_response()

# Save conversation
agent.save_conversation()
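To make the data model concrete, here is a minimal sketch (not the library's implementation) of the role-validated history and save-to-disk behavior that add_message() and save_conversation() provide; the `History` class and its file format are assumptions for illustration:

```python
# Sketch of a role-managed conversation history with JSON persistence.
import json

VALID_ROLES = {"system", "user", "assistant"}

class History:
    def __init__(self) -> None:
        self.messages: list[dict[str, str]] = []

    def add_message(self, role: str, content: str) -> None:
        # Reject anything outside the system/user/assistant roles
        if role not in VALID_ROLES:
            raise ValueError(f"unknown role: {role!r}")
        self.messages.append({"role": role, "content": content})

    def save(self, path: str) -> None:
        # Persist the full transcript as a JSON array of messages
        with open(path, "w") as fh:
            json.dump(self.messages, fh, indent=2)

h = History()
h.add_message("system", "You are a helpful assistant")
h.add_message("user", "What is the capital of France?")
h.save("conversation.json")
print(len(h.messages))  # 2
```

The list-of-dicts shape matches what the OpenAI and Anthropic chat APIs expect, which is why it is a natural on-disk format for saved conversations.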

Setting Up the Debugging Console

python app.py
# then open http://localhost:8000 in your browser

Demo video: https://github.com/user-attachments/assets/5f640029-24e6-44ea-a3a3-02eb3de0d4df

Development

Running Tests

pytest tests/

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For support, please open an issue in the GitHub repository or contact the maintainers.
