
AI-powered chat app for browsing and search

Project description

ChatLite 🤖

A lightweight, extensible chat application framework for building AI-powered chat interfaces. ChatLite provides an easy-to-use platform for integrating various language models with web-based chat applications.

✨ Features

  • 🔄 Real-time WebSocket communication
  • 🎯 Multi-model support (Llama, Qwen, etc.)
  • 🌐 Web search integration
  • 🎨 Customizable UI with modern design
  • 🔌 Plugin architecture for easy extensions
  • 💬 Chat history management
  • 🎭 Multiple agent types support
  • 📱 Responsive design

🚀 Quick Start

Installation

pip install chatlite

Basic Usage

import chatlite

# Start a simple chat server with Llama 3.2
chatlite.local_llama3p2()

# Or use Qwen 2.5
chatlite.local_qwen2p5()

# Custom configuration
server = chatlite.create_server(
    model_type="local",
    model_name="llama3.2:latest",
    temperature=0.7,
    max_tokens=4000
)
server.run()

Pre-configured Models

ChatLite comes with several pre-configured models:

# Use different models directly
from chatlite import mistral_7b_v3, mixtral_8x7b, qwen_72b

# Start Mistral 7B server
mistral_7b_v3()

# Start Mixtral 8x7B server
mixtral_8x7b()

# Start Qwen 72B server
qwen_72b()

💻 Frontend Integration

ChatLite includes a Flutter-based frontend that can be easily customized. Here's a basic example of connecting to the ChatLite server:

final channel = WebSocketChannel.connect(
  Uri.parse('ws://localhost:8143/ws/$clientId'),
);

// Send message
channel.sink.add(json.encode({
  'message': 'Hello!',
  'model': 'llama3.2:latest',
  'system_prompt': 'You are a helpful assistant',
  'agent_type': 'WebSearchAgent',
  'is_websearch_chat': true
}));

// Listen for responses
channel.stream.listen(
  (message) {
    final data = jsonDecode(message);
    if (data['type'] == 'stream') {
      print(data['message']);
    }
  },
  onError: (error) => print('Error: $error'),
  onDone: () => print('Connection closed'),
);
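The same endpoint can be exercised from Python. Below is a minimal client sketch assuming the message schema and the `ws://localhost:8143/ws/<clientId>` endpoint shown in the Flutter example above; it uses the third-party `websockets` package (`pip install websockets`), which is not bundled with ChatLite.

```python
import asyncio
import json
import uuid

def extract_stream_text(raw):
    """Return the text of a 'stream' frame, or None for other frame types."""
    data = json.loads(raw)
    if data.get("type") == "stream":
        return data.get("message")
    return None

async def chat(prompt):
    # Imported lazily so extract_stream_text stays dependency-free.
    import websockets

    client_id = uuid.uuid4().hex
    async with websockets.connect(f"ws://localhost:8143/ws/{client_id}") as ws:
        await ws.send(json.dumps({
            "message": prompt,
            "model": "llama3.2:latest",
            "system_prompt": "You are a helpful assistant",
        }))
        # Print streamed tokens as they arrive.
        async for raw in ws:
            text = extract_stream_text(raw)
            if text is not None:
                print(text, end="", flush=True)
```

With a ChatLite server running, invoke it with `asyncio.run(chat("Hello!"))`.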

🔧 Configuration

ChatLite supports various configuration options:

from chatlite import create_server

server = create_server(
    model_type="local",          # local, huggingface, etc.
    model_name="llama3.2:latest",
    api_key="your-api-key",      # if needed
    temperature=0.7,             # model temperature
    max_tokens=4000,             # max response length
    base_url="http://localhost:11434/v1",  # model API endpoint
)
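These options can also be collected into a plain dict before calling `create_server`, which makes it easy to override defaults from the environment. A sketch, mirroring the call above; the `CHATLITE_*` variable names are illustrative, not part of ChatLite:

```python
import os

def server_config(**overrides):
    """Build create_server() keyword arguments, with env-var overrides."""
    config = {
        "model_type": os.getenv("CHATLITE_MODEL_TYPE", "local"),
        "model_name": os.getenv("CHATLITE_MODEL_NAME", "llama3.2:latest"),
        "temperature": float(os.getenv("CHATLITE_TEMPERATURE", "0.7")),
        "max_tokens": int(os.getenv("CHATLITE_MAX_TOKENS", "4000")),
        "base_url": os.getenv("CHATLITE_BASE_URL", "http://localhost:11434/v1"),
    }
    # Explicit keyword arguments win over env vars and defaults.
    config.update(overrides)
    return config

# server = chatlite.create_server(**server_config(temperature=0.2))
```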

🧩 Available Agents

ChatLite supports different agent types for specialized tasks:

  • WebSearchAgent: Internet-enabled chat with web search capabilities
  • RawWebSearchAgent: Direct web search results without summarization
  • EmailAssistantFeature: Email composition and analysis
  • DefaultChatFeature: Standard chat functionality

Example usage:

# Client-side configuration
message_data = {
    "message": "What's the latest news about AI?",
    "model": "llama3.2:latest",
    "agent_type": "WebSearchAgent",
    "is_websearch_chat": True
}
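A small helper can build and validate this payload before sending it over the WebSocket. A sketch: the field names come from the examples above, the agent names from the list in this section, and the rule that only the web-search agents set `is_websearch_chat` is an assumption.

```python
import json

# Agent names as listed in the "Available Agents" section above.
VALID_AGENTS = {
    "WebSearchAgent",
    "RawWebSearchAgent",
    "EmailAssistantFeature",
    "DefaultChatFeature",
}

def build_message(text, agent_type="DefaultChatFeature", model="llama3.2:latest"):
    """Return the JSON-encoded message payload for the ChatLite WebSocket."""
    if agent_type not in VALID_AGENTS:
        raise ValueError(f"unknown agent_type: {agent_type}")
    return json.dumps({
        "message": text,
        "model": model,
        "agent_type": agent_type,
        # Assumption: only the web-search agents need this flag set.
        "is_websearch_chat": agent_type in {"WebSearchAgent", "RawWebSearchAgent"},
    })
```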

🎨 UI Customization

The included Flutter frontend supports extensive customization:

ThemeData(
  brightness: Brightness.dark,
  scaffoldBackgroundColor: const Color(0xFF1C1C1E),
  primaryColor: const Color(0xFF1C1C1E),
  colorScheme: const ColorScheme.dark(
    primary: Color(0xFFFF7762),
    secondary: Color(0xFFFF7762),
  ),
)

📦 Project Structure

chatlite/
├── __init__.py          # Main package initialization
├── core/                # Core functionality
│   ├── config.py        # Configuration handling
│   ├── model_service.py # Model interaction
│   └── features/        # Feature implementations
├── ui/                  # Flutter frontend
└── examples/            # Usage examples

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Built with FastAPI and Flutter
  • Inspired by modern chat applications
  • Uses various open-source language models

⚠️ Disclaimer

This is an open-source project and should be used responsibly. Please ensure compliance with all model licenses and usage terms.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

chatlite-0.0.56.tar.gz (19.8 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

chatlite-0.0.56-py3-none-any.whl (12.1 kB)


File details

Details for the file chatlite-0.0.56.tar.gz.

File metadata

  • Download URL: chatlite-0.0.56.tar.gz
  • Upload date:
  • Size: 19.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.11.0rc1 Linux/6.8.0-49-generic

File hashes

Hashes for chatlite-0.0.56.tar.gz:

  • SHA256: d10bbbe64ec1fc0b7c7eac98f88bc9eb0ebefed30588e100937037d6eb2d5c21
  • MD5: f14c1812f252853248caee758352e01f
  • BLAKE2b-256: f52540586dc049e2e7d0286247683da81e016a4004eac8cbdb5b741e7b6168be


File details

Details for the file chatlite-0.0.56-py3-none-any.whl.

File metadata

  • Download URL: chatlite-0.0.56-py3-none-any.whl
  • Upload date:
  • Size: 12.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.11.0rc1 Linux/6.8.0-49-generic

File hashes

Hashes for chatlite-0.0.56-py3-none-any.whl:

  • SHA256: a0ac5d12c4a605246296fced6f100a8d8ec0e74878eda0ca10208ab1e7215fb9
  • MD5: 8c8ef79664e5d26fb5952daf1a259dc4
  • BLAKE2b-256: 4bde946020bfc07e2c5f0814b9f9cdb80ac6cca11e8c20fbf05c6032c603a704

