Bielik – local Ollama chat client (CLI + web)
Project description
bielik
Author: Tom Sapletta
License: Apache-2.0
Bielik is a local chat client for Ollama with CLI and web interfaces, created specifically for the Polish language model Bielik from SpeakLeash.
Architecture
┌──────────────────────────────────────────────────────────────────┐
│                              BIELIK                              │
├─────────────────────┬───────────────────────┬────────────────────┤
│ CLI Shell           │ FastAPI Server        │ Test Suite         │
│  • Interactive      │  • REST /chat         │  • Unit tests      │
│  • Help system      │  • WebSocket /ws      │  • Mock API        │
│  • Cross-platform   │  • Port 8888          │  • CI/CD           │
└─────────────────────┴───────────────────────┴────────────────────┘
          │                      │                      │
          ▼                      ▼                      ▼
┌──────────────────────────────────────────────────────────────────┐
│                         CONNECTION LAYER                         │
│  ┌────────────────────┐      ┌─────────────────────────────┐     │
│  │ REST API (main)    │─────▶│ Ollama library (fallback)   │     │
│  │  └ HTTP requests   │      │  └ ollama.chat()            │     │
│  │  └ /v1/chat/...    │      │  └ Direct integration       │     │
│  └────────────────────┘      └─────────────────────────────┘     │
└────────────────────────────────┬─────────────────────────────────┘
                                 ▼
┌──────────────────────────────────────────────────────────────────┐
│                          OLLAMA SERVER                           │
│   localhost:11434 (default)                                      │
│   Model: bielik (Polish LLM)                                     │
│   Links: SpeakLeash → HuggingFace → Ollama                       │
└──────────────────────────────────────────────────────────────────┘
About the Bielik Model
Bielik is a groundbreaking Polish language model created by SpeakLeash, a foundation dedicated to the development of Polish artificial intelligence.
External Dependencies & Links:
- Ollama – local LLM runtime that hosts the Bielik model
- Bielik model on HuggingFace – official model repository
- SpeakLeash Foundation – creators of the Bielik model
- Polish AI Initiative – government support for Polish AI
How it Works:
1. The bielik package connects to your local Ollama server.
2. Ollama runs the Bielik model (downloaded from HuggingFace via SpeakLeash).
3. The chat interface (CLI/web) sends queries → Ollama API → Bielik model → responses.
4. A fallback system ensures connectivity (REST API → ollama library).
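The REST-first, library-second fallback can be sketched as a chain of attempts. The helper below is illustrative only: the function names, the `/api/chat` endpoint, and the response handling are assumptions, not the package's actual internals.

```python
import json
import urllib.request


def first_success(attempts):
    """Return the result of the first callable that does not raise."""
    last_exc = None
    for attempt in attempts:
        try:
            return attempt()
        except Exception as exc:
            last_exc = exc
    raise last_exc


def via_rest(messages, model, host="http://localhost:11434"):
    """Primary path: plain HTTP POST to Ollama's chat endpoint."""
    body = json.dumps({"model": model, "messages": messages, "stream": False})
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


def via_library(messages, model):
    """Fallback path: the optional `ollama` Python library."""
    import ollama  # only imported when the REST path has already failed
    return ollama.chat(model=model, messages=messages)["message"]["content"]


def send_chat(messages, model="SpeakLeash/bielik-7b-instruct-v0.1-gguf"):
    """Try REST first; fall back to the library if the request raises."""
    return first_success([
        lambda: via_rest(messages, model),
        lambda: via_library(messages, model),
    ])
```

With the server down, `via_rest` raises and `first_success` simply moves on to the library path; only if every path fails does the last error surface.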
Features
- Auto-Setup System – intelligent first-time setup for beginners
  - Detection – automatically detects missing components
  - Installation – cross-platform Ollama installation (Linux/macOS)
  - Configuration – downloads and configures the Bielik model
  - Interactive – user-friendly prompts and error handling
- Enhanced CLI (bielik) – interactive chat shell with modular architecture
  - Commands – :help, :status, :setup, :clear, :exit, :models, :download, :delete, :switch
  - Arguments – --setup, --no-setup, --model, --host, --use-local, --local-model
  - HF Integration – direct Hugging Face model management and local execution
  - Fallback – REST API primary, ollama library secondary, local models tertiary
  - Cross-platform – Windows, macOS, Linux support
- Python API – programmatic access via the BielikClient class
  - Chat methods – chat(), query(), conversation management
  - System control – status checking, auto-setup, model management
  - HF Models – download, manage, and run SpeakLeash models locally
  - Export – conversation history in JSON, text, and Markdown formats
- Web Server (FastAPI on port 8888)
  - REST – POST /chat endpoint for JSON communication
  - WebSocket – WS /ws for real-time chat
  - Fallback – same dual connectivity as the CLI
- Quality Assurance – comprehensive testing and development tools
  - Unit tests – full coverage with mocked APIs
  - CI/CD – GitHub Actions automation
  - Code quality – Flake8 linting, automated builds
Installation
pip install bielik
Optional dependency (official Ollama lib):
pip install "bielik[ollama]"
Quick Start Guide
NEW: Automatic Setup (Recommended)
The easiest way to get started is with the new automatic setup system:
# Install the Bielik package
pip install bielik
# Start with automatic setup – it will handle everything!
bielik
# Or force setup mode:
bielik --setup
The auto-setup system will:
- Detect whether Ollama is installed
- Install Ollama automatically (Linux/macOS)
- Start the Ollama server if it is not running
- Download the Bielik model (SpeakLeash/bielik-7b-instruct-v0.1-gguf)
- Configure everything for optimal performance
Manual Setup (Advanced Users)
If you prefer manual control:
1. Install Ollama (Cross-platform)
Windows:
# Download installer from ollama.com or use winget:
winget install Ollama.Ollama
macOS:
# Download .dmg from ollama.com or use Homebrew:
brew install ollama
Linux:
# Ubuntu/Debian:
curl -fsSL https://ollama.com/install.sh | sh
# Arch Linux:
sudo pacman -S ollama
2. Set Up the Bielik Model
# Start Ollama service (Linux/macOS):
ollama serve
# Windows: Ollama starts automatically
# Install Bielik model (new full model name):
ollama pull SpeakLeash/bielik-7b-instruct-v0.1-gguf
# Verify installation:
ollama list
3. Install & Use the Bielik Package
# Install from PyPI:
pip install bielik
# Start CLI chat:
bielik
# Skip auto-setup if you prefer manual control:
bielik --no-setup
# Or start web server:
uvicorn bielik.server:app --port 8888
Usage
CLI Features & Options
Command-Line Arguments (when starting bielik)
# Basic usage
bielik # Start interactive chat with auto-setup
# Advanced options
bielik --setup # Force setup mode
bielik --no-setup # Skip automatic setup
bielik --model other-model # Use different model
bielik --host http://other-host:11434 # Use different Ollama server
bielik --use-local # Use local HuggingFace models (bypass Ollama)
bielik --local-model model-name # Specify local model to use
bielik --help # Show all options
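The documented flags can be modeled with a small argparse parser. This is an illustrative sketch of the command-line interface as described above, not the package's actual cli/main.py:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    # Mirrors the documented flags; defaults follow the README's
    # environment-variable section.
    p = argparse.ArgumentParser(prog="bielik")
    p.add_argument("--setup", action="store_true", help="force setup mode")
    p.add_argument("--no-setup", action="store_true", help="skip automatic setup")
    p.add_argument("--model", default="SpeakLeash/bielik-7b-instruct-v0.1-gguf",
                   help="model to use")
    p.add_argument("--host", default="http://localhost:11434",
                   help="Ollama server URL")
    p.add_argument("--use-local", action="store_true",
                   help="use local HuggingFace models (bypass Ollama)")
    p.add_argument("--local-model", help="local model to run with --use-local")
    return p


# Example: parse a flag combination from the usage table above.
args = build_parser().parse_args(["--no-setup", "--model", "other-model"])
```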
Interactive Commands (inside the bielik chat session)
Important: These commands only work inside the interactive chat session, not as command-line arguments.
# Start an interactive session first
$ bielik
# Then use these commands inside the chat:
You: :help              # Show help and commands
You: :status            # Check Ollama connection and model availability
You: :setup             # Run the interactive setup system
You: :models            # List available and downloaded HuggingFace models
You: :download <model>  # Download a SpeakLeash model from Hugging Face
You: :delete <model>    # Delete a downloaded model
You: :switch <model>    # Switch to a different model for execution
You: :storage           # Show model storage statistics
You: :clear             # Clear conversation history
You: :exit              # Quit (or Ctrl+C)
Usage Examples
# ✓ Correct – start with a specific model
$ bielik --model SpeakLeash/Bielik-4.5B-v3.0-Instruct-GGUF
# ✓ Correct – interactive commands inside the session
$ bielik
You: :status
You: :switch SpeakLeash/Bielik-4.5B-v3.0-Instruct-GGUF
# ✗ Incorrect – interactive commands as CLI arguments
$ bielik :status        # This won't work!
$ bielik :switch model  # This won't work!
Python API
NEW: Use Bielik programmatically in your Python applications:
from bielik.client import BielikClient
# Create client with auto-setup
client = BielikClient()
# Send a message
response = client.chat("How are you?")
print(response)
# Get system status
status = client.get_status()
print(f"Ollama running: {status['ollama_running']}")
print(f"Model available: {status['model_available']}")
# Export conversation
history = client.export_conversation(format="markdown")
Quick functions:
from bielik.client import quick_chat, get_system_status
# One-off query
response = quick_chat("What is artificial intelligence?")
# Check system without setup
status = get_system_status()
BielikClient options:
- model – model name to use
- host – Ollama server URL
- auto_setup – enable/disable automatic setup (default: True)
Web API
uvicorn bielik.server:app --port 8888
Endpoints:
- POST /chat – JSON chat endpoint
- WS /ws – WebSocket real-time chat
Example request:
{"messages": [{"role":"user","content":"Hello!"}]}
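A request of that shape can be sent with only the standard library. The helper below builds the documented body; the assumption that the server replies with JSON (and its exact schema) is mine, not the source's:

```python
import json
import urllib.request


def build_payload(content: str) -> dict:
    """Build the documented request body for POST /chat."""
    return {"messages": [{"role": "user", "content": content}]}


def post_chat(content: str, url: str = "http://localhost:8888/chat"):
    """POST one user message to the bielik web server; parse the reply as JSON."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(content)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```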
Environment Variables
- OLLAMA_HOST – default: http://localhost:11434
- BIELIK_MODEL – default: SpeakLeash/bielik-7b-instruct-v0.1-gguf
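Resolution of these variables presumably follows the usual override-or-default pattern; here is a minimal sketch under that assumption (not the package's actual config.py):

```python
import os

# Defaults copied from the documented environment variables.
DEFAULTS = {
    "OLLAMA_HOST": "http://localhost:11434",
    "BIELIK_MODEL": "SpeakLeash/bielik-7b-instruct-v0.1-gguf",
}


def setting(name: str, env=None) -> str:
    """Return the environment override if set, otherwise the documented default."""
    env = os.environ if env is None else env
    return env.get(name, DEFAULTS[name])
```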
Troubleshooting
Auto-Setup Issues
Problem: Auto-setup fails to install Ollama
# Manual installation required for Windows
# Download from: https://ollama.com/download/windows
# Linux/macOS alternatives:
curl -fsSL https://ollama.com/install.sh | sh # Linux
brew install ollama # macOS
Problem: Model download fails or times out
# Try manual download with longer timeout
ollama pull SpeakLeash/bielik-7b-instruct-v0.1-gguf
# Check available disk space (model is ~4GB)
df -h
# Check network connection
curl -I https://ollama.com
Problem: Ollama server won't start
# Check if port 11434 is in use
lsof -i :11434 # Linux/macOS
netstat -an | findstr 11434 # Windows
# Try a different port (ollama serve reads its bind address from OLLAMA_HOST)
OLLAMA_HOST=127.0.0.1:11435 ollama serve
Runtime Issues
Problem: "Connection refused" errors
# Check Ollama status from inside an interactive session
$ bielik
You: :status
# Restart Ollama service
pkill ollama && ollama serve # Linux/macOS
# Manual server start
ollama serve
Problem: Model responses are slow
# Check system resources
htop # Linux/macOS
taskmgr # Windows
# Use smaller model if needed
bielik --model llama2 # If available
Problem: Python API import errors
# Reinstall with dependencies
pip uninstall bielik
pip install "bielik[ollama]"
# Check Python path
python -c "import bielik.client; print('OK')"
Getting Help
- GitHub Issues – report bugs and feature requests
- Command help – bielik --help, or :help inside the CLI
- System status – the :status command or the get_system_status() function
Development
git clone https://github.com/tomsapletta/bielik.git
cd bielik
python -m venv .venv
source .venv/bin/activate
pip install -e ".[ollama]"
Package Structure
bielik/
├── bielik/
│   ├── __init__.py           # Package initialization
│   ├── cli.py                # CLI entry point (wrapper)
│   ├── client.py             # Client entry point (wrapper)
│   ├── server.py             # FastAPI web server
│   ├── config.py             # Configuration management
│   ├── hf_models.py          # Hugging Face model management
│   ├── content_processor.py  # Content processing utilities
│   ├── cli/                  # Modular CLI components
│   │   ├── __init__.py
│   │   ├── main.py           # Main CLI entry and argument parsing
│   │   ├── commands.py       # Command processing and execution
│   │   ├── models.py         # HF model management CLI
│   │   ├── setup.py          # Interactive setup manager
│   │   └── send_chat.py      # Chat communication handling
│   └── client/               # Modular client components
│       ├── __init__.py       # Client package exports
│       ├── core.py           # Core BielikClient class
│       ├── model_manager.py  # HF model operations for client
│       └── utils.py          # Client utility functions
├── tests/
│   ├── __init__.py
│   ├── test_cli.py           # CLI unit tests
│   └── test_server.py        # Server unit tests
├── pyproject.toml            # Modern Python packaging
├── setup.cfg                 # Package configuration
├── MANIFEST.in               # Package manifest
├── LICENSE                   # Apache 2.0 license
├── README.md                 # This documentation
├── Makefile                  # Development automation
├── todo.md                   # Project specifications
└── .github/workflows/        # CI/CD automation
    └── python-publish.yml
License
Apache-2.0
File details
Details for the file bielik-0.1.3.tar.gz.
File metadata
- Download URL: bielik-0.1.3.tar.gz
- Upload date:
- Size: 51.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 40d2fe6502844f10e3e03d2c54fe0a4e49e27c9d79166ca20ac79ec78be9cf83 |
| MD5 | aa3dbec9e4d20f8211295fd1fda293b8 |
| BLAKE2b-256 | cd3e4d5a957ea85fbf2599492d8faef574a2736e6883dc8e2b2781add3f78654 |
File details
Details for the file bielik-0.1.3-py3-none-any.whl.
File metadata
- Download URL: bielik-0.1.3-py3-none-any.whl
- Upload date:
- Size: 32.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | d71d2d0cc7155b1480eae992e322aa57914fdd7d41eac449970f2338c763483c |
| MD5 | 828f6f127990d55d8ddf0f1f566a73f1 |
| BLAKE2b-256 | 15015cb87db4f1b94582038c934488e96cac7dcabc4afebbf91ce3285b8d17dc |