
A Python package for managing LLM chat conversation history

Project description

LLM Dialog Manager

A Python package for managing AI chat conversation history, with support for multiple LLM providers (OpenAI, Anthropic, Google, X.AI) and convenient features for tracking, saving, and searching conversations.

Features

  • Support for multiple AI providers:
    • OpenAI (GPT-3.5, GPT-4)
    • Anthropic (Claude)
    • Google (Gemini)
    • X.AI (Grok)
  • Intelligent message role management (system, user, assistant)
  • Conversation history tracking and validation
  • Load balancing across multiple API keys
  • Error handling and retry mechanisms
  • Conversation saving and loading
  • Memory management options
  • Conversation search and indexing
  • Rich conversation display options

Installation

pip install llm-dialog-manager

Quick Start

Basic Usage

from llm_dialog_manager import ChatHistory

# Initialize with a system message
history = ChatHistory("You are a helpful assistant")

# Add messages
history.add_user_message("Hello!")
history.add_assistant_message("Hi there! How can I help you today?")

# Print conversation
print(history)
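
For orientation, chat history for LLM APIs is conventionally a plain list of role/content messages. The sketch below shows that generic structure; it illustrates the idea, not ChatHistory's internal representation:

# The generic shape most chat-completion APIs expect: a list of role/content messages
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there! How can I help you today?"},
]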

Using the AI Agent

from llm_dialog_manager import Agent

# Initialize an agent with a specific model
agent = Agent("claude-2.1", memory_enabled=True)

# Add messages and generate responses
agent.add_message("system", "You are a helpful assistant")
agent.add_message("user", "What is the capital of France?")
response = agent.generate_response()

# Save conversation
agent.save_conversation()
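
The feature list mentions error handling and retry mechanisms. If you want explicit control on the caller side, a minimal retry wrapper around generate_response() could look like the sketch below; the backoff values and the broad exception handling are illustrative assumptions, not part of the package's API:

import time

def generate_with_retry(agent, max_attempts=3, base_delay=1.0):
    """Call agent.generate_response(), retrying failed attempts with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return agent.generate_response()
        except Exception:  # the exact exception types depend on the provider SDK
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # wait 1s, 2s, 4s, ...

response = generate_with_retry(agent)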

Advanced Features

Managing Multiple API Keys

from llm_dialog_manager import Agent

# Use specific API key
agent = Agent("gpt-4", api_key="your-api-key")

# Or use environment variables
# OPENAI_API_KEY_1=key1
# OPENAI_API_KEY_2=key2
# The system will automatically handle load balancing
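
Conceptually, load balancing across several keys just means rotating through them on successive requests. The sketch below illustrates that idea with plain Python; it is not the package's internal implementation:

import os
from itertools import cycle

# Collect numbered keys such as OPENAI_API_KEY_1, OPENAI_API_KEY_2, ...
keys = [value for name, value in sorted(os.environ.items())
        if name.startswith("OPENAI_API_KEY_")]

# Hand out keys in round-robin order (assumes at least one key is set)
key_pool = cycle(keys)
next_key = next(key_pool)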

Conversation Management

from llm_dialog_manager import ChatHistory

history = ChatHistory()

# Add messages with role validation
history.add_message("Hello system", "system")
history.add_message("Hello user", "user")
history.add_message("Hello assistant", "assistant")

# Search conversations
results = history.search_for_keyword("hello")

# Get conversation status
status = history.conversation_status()
history.display_conversation_status()

# Get conversation snippets
snippet = history.get_conversation_snippet(1)
history.display_snippet(1)
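
Assuming search_for_keyword() returns the matching messages as an iterable (the exact return type is not documented here), the results can be inspected directly:

# Assumes results is an iterable of matching messages
for message in results:
    print(message)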

Environment Variables

Create a .env file in your project root:

# OpenAI
OPENAI_API_KEY_1=your-key-1
OPENAI_API_BASE_1=https://api.openai.com/v1

# Anthropic
ANTHROPIC_API_KEY_1=your-anthropic-key
ANTHROPIC_API_BASE_1=https://api.anthropic.com

# Google
GEMINI_API_KEY=your-gemini-key

# X.AI
XAI_API_KEY=your-x-key
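
If you keep these variables in a .env file rather than exporting them in your shell, one common approach is to load them with python-dotenv before constructing an agent. Whether llm-dialog-manager reads .env files automatically is not stated here, so the explicit load is a conservative assumption:

from dotenv import load_dotenv
from llm_dialog_manager import Agent

load_dotenv()  # copies the variables from .env into os.environ

agent = Agent("gpt-4")  # the numbered OPENAI_API_KEY_* variables are then visible to the package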

Development

Running Tests

pytest tests/

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For support, please open an issue in the GitHub repository or contact the maintainers.

Project details


Release history

This version

0.1.6

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm_dialog_manager-0.1.6.tar.gz (11.6 kB)


Built Distribution

llm_dialog_manager-0.1.6-py3-none-any.whl (10.2 kB)


File details

Details for the file llm_dialog_manager-0.1.6.tar.gz.

File metadata

  • Download URL: llm_dialog_manager-0.1.6.tar.gz
  • Upload date:
  • Size: 11.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for llm_dialog_manager-0.1.6.tar.gz
  • SHA256: c594616c4e1660304ca1785bc8ceb22b29d941eb68f158020cd25b6f86738e3f
  • MD5: c8b05d3c18ce093ed6abf6dc11dbed61
  • BLAKE2b-256: 2e1c177d0bfb0d4ba40b571ca16b19cb8720258d1a803b84bccc6e233bc3212e


File details

Details for the file llm_dialog_manager-0.1.6-py3-none-any.whl.

File metadata

File hashes

Hashes for llm_dialog_manager-0.1.6-py3-none-any.whl
  • SHA256: f3fa83a65b832e0455d3c75e7c46ae2aa2568ca5d7c87ad44055596ed192e41f
  • MD5: 46a15d74741b98f002494cb6ef79e139
  • BLAKE2b-256: 261d0d98bb82339bfcbff04a50e348d010db6d3b8f20b1dfc0a5b81c378b2405

