# ChatLite 🤖

A lightweight, extensible chat application framework for building AI-powered chat interfaces. ChatLite provides an easy-to-use platform for integrating various language models into web-based chat applications.
## ✨ Features
- 🔄 Real-time WebSocket communication
- 🎯 Multi-model support (Llama, Qwen, etc.)
- 🌐 Web search integration
- 🎨 Customizable UI with modern design
- 🔌 Plugin architecture for easy extensions
- 💬 Chat history management
- 🎭 Multiple agent types support
- 📱 Responsive design
## 🚀 Quick Start

### Installation

```bash
pip install chatlite
```
### Basic Usage

```python
import chatlite

# Start a simple chat server with Llama 3.2
chatlite.local_llama3p2()

# Or use Qwen 2.5
chatlite.local_qwen2p5()

# Custom configuration
server = chatlite.create_server(
    model_type="local",
    model_name="llama3.2:latest",
    temperature=0.7,
    max_tokens=4000,
)
server.run()
```
### Pre-configured Models

ChatLite comes with several pre-configured models:

```python
# Use different models directly
from chatlite import mistral_7b_v3, mixtral_8x7b, qwen_72b

# Start a Mistral 7B server
mistral_7b_v3()

# Start a Mixtral 8x7B server
mixtral_8x7b()

# Start a Qwen 72B server
qwen_72b()
```
## 💻 Frontend Integration

ChatLite includes a Flutter-based frontend that can be easily customized. Here's a basic example of connecting to the ChatLite server:

```dart
final channel = WebSocketChannel.connect(
  Uri.parse('ws://localhost:8143/ws/$clientId'),
);

// Send a message
channel.sink.add(json.encode({
  'message': 'Hello!',
  'model': 'llama3.2:latest',
  'system_prompt': 'You are a helpful assistant',
  'agent_type': 'WebSearchAgent',
  'is_websearch_chat': true,
}));

// Listen for streamed responses
channel.stream.listen(
  (message) {
    final data = jsonDecode(message);
    if (data['type'] == 'stream') {
      print(data['message']);
    }
  },
  onError: (error) => print('Error: $error'),
  onDone: () => print('Connection closed'),
);
```
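For quick smoke tests without the Flutter frontend, the same message schema can be exercised from Python. The helpers below are a sketch, not part of the ChatLite API: they only build and parse JSON frames matching the fields shown above, so any WebSocket client (for example, the third-party `websockets` package) can use them.

```python
import json


def build_chat_message(message,
                       model="llama3.2:latest",
                       system_prompt="You are a helpful assistant",
                       agent_type="DefaultChatFeature",
                       is_websearch_chat=False):
    """Build a JSON payload matching the schema the Flutter client sends."""
    return json.dumps({
        "message": message,
        "model": model,
        "system_prompt": system_prompt,
        "agent_type": agent_type,
        "is_websearch_chat": is_websearch_chat,
    })


def parse_stream_chunk(raw):
    """Return the streamed text for 'stream' frames, otherwise None."""
    data = json.loads(raw)
    return data["message"] if data.get("type") == "stream" else None
```

A client loop would then connect to `ws://localhost:8143/ws/<client_id>`, send `build_chat_message(...)`, and print each non-`None` result of `parse_stream_chunk(...)`.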
## 🔧 Configuration

ChatLite supports various configuration options:

```python
from chatlite import create_server

server = create_server(
    model_type="local",                    # local, huggingface, etc.
    model_name="llama3.2:latest",
    api_key="your-api-key",                # if needed
    temperature=0.7,                       # model temperature
    max_tokens=4000,                       # max response length
    base_url="http://localhost:11434/v1",  # model API endpoint
)
```
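In deployments these options often come from the environment rather than hard-coded values. The helper below is a sketch under assumed variable names (ChatLite does not define `CHATLITE_*` variables itself); it collects keyword arguments with the same defaults shown above, ready to pass as `create_server(**config_from_env())`.

```python
import os


def config_from_env(prefix="CHATLITE_"):
    """Collect create_server() keyword arguments from environment
    variables, falling back to the defaults used in the example above."""
    env = os.environ
    return {
        "model_type": env.get(prefix + "MODEL_TYPE", "local"),
        "model_name": env.get(prefix + "MODEL_NAME", "llama3.2:latest"),
        "api_key": env.get(prefix + "API_KEY"),  # None if the backend needs no key
        "temperature": float(env.get(prefix + "TEMPERATURE", "0.7")),
        "max_tokens": int(env.get(prefix + "MAX_TOKENS", "4000")),
        "base_url": env.get(prefix + "BASE_URL", "http://localhost:11434/v1"),
    }
```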
## 🧩 Available Agents

ChatLite supports different agent types for specialized tasks:

- **WebSearchAgent**: Internet-enabled chat with web search capabilities
- **RawWebSearchAgent**: Direct web search results without summarization
- **EmailAssistantFeature**: Email composition and analysis
- **DefaultChatFeature**: Standard chat functionality
Example usage:

```python
# Client-side configuration
message_data = {
    "message": "What's the latest news about AI?",
    "model": "llama3.2:latest",
    "agent_type": "WebSearchAgent",
    "is_websearch_chat": True,
}
```
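On the server side, routing a request to the right agent can be as simple as a registry keyed on the `agent_type` field. This is an illustrative sketch, not ChatLite's actual implementation; the keys mirror the agent types listed above, and the lambda handlers stand in for real agent logic.

```python
# Hypothetical registry mapping an "agent_type" value to a handler.
AGENT_HANDLERS = {
    "WebSearchAgent": lambda msg: f"[web-search] {msg}",
    "RawWebSearchAgent": lambda msg: f"[raw-results] {msg}",
    "EmailAssistantFeature": lambda msg: f"[email] {msg}",
    "DefaultChatFeature": lambda msg: f"[chat] {msg}",
}


def dispatch(request: dict) -> str:
    """Pick a handler for the request, falling back to the default chat agent
    when agent_type is missing or unknown."""
    handler = AGENT_HANDLERS.get(request.get("agent_type"),
                                 AGENT_HANDLERS["DefaultChatFeature"])
    return handler(request["message"])
```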
## 🎨 UI Customization

The included Flutter frontend supports extensive customization:

```dart
ThemeData(
  brightness: Brightness.dark,
  scaffoldBackgroundColor: const Color(0xFF1C1C1E),
  primaryColor: const Color(0xFF1C1C1E),
  colorScheme: const ColorScheme.dark(
    primary: Color(0xFFFF7762),
    secondary: Color(0xFFFF7762),
  ),
)
```
## 📦 Project Structure

```
chatlite/
├── __init__.py          # Main package initialization
├── core/                # Core functionality
│   ├── config.py        # Configuration handling
│   ├── model_service.py # Model interaction
│   └── features/        # Feature implementations
├── ui/                  # Flutter frontend
└── examples/            # Usage examples
```
## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments
- Built with FastAPI and Flutter
- Inspired by modern chat applications
- Uses various open-source language models
## ⚠️ Disclaimer
This is an open-source project and should be used responsibly. Please ensure compliance with all model licenses and usage terms.