
Bielik - local Ollama chat client (CLI + web)



🦅 bielik


Author: Tom Sapletta
License: Apache-2.0

🇵🇱 Bielik is a local chat client for Ollama with CLI and web interfaces, created specifically for the Polish language model Bielik from SpeakLeash.


๐Ÿ—๏ธ Architecture

┌─────────────────────────────────────────────────────────────────┐
│                            🦅 BIELIK                            │
├─────────────────────┬───────────────────────┬───────────────────┤
│  🖥️ CLI Shell       │  🌐 FastAPI Server    │  🧪 Test Suite    │
│ ┌─────────────────┐ │ ┌───────────────────┐ │ ┌───────────────┐ │
│ │ • Interactive   │ │ │ • REST /chat      │ │ │ • Unit tests  │ │
│ │ • Help system   │ │ │ • WebSocket /ws   │ │ │ • Mock API    │ │
│ │ • Cross-platform│ │ │ • Port 8888       │ │ │ • CI/CD       │ │
│ └─────────────────┘ │ └───────────────────┘ │ └───────────────┘ │
└─────────────────────┴───────────────────────┴───────────────────┘
           │                      │                      │
           ▼                      ▼                      ▼
┌─────────────────────────────────────────────────────────────────┐
│                       🔄 CONNECTION LAYER                       │
│  ┌──────────────────────┐     ┌─────────────────────────────┐   │
│  │    REST API (main)   │◄───►│ Ollama Library (fallback)   │   │
│  │ ┌─ HTTP requests     │     │ ┌─ ollama.chat()            │   │
│  │ └─ /v1/chat/...      │     │ └─ Direct integration       │   │
│  └──────────────────────┘     └─────────────────────────────┘   │
└───────────────────────────────┬─────────────────────────────────┘
                                ▼
┌─────────────────────────────────────────────────────────────────┐
│                        🦙 OLLAMA SERVER                         │
│ ┌─────────────────────────────────────────────────────────────┐ │
│ │ 📍 localhost:11434 (default)                                │ │
│ │ 🤖 Model: bielik (Polish LLM)                               │ │
│ │ 🔗 Links: SpeakLeash → HuggingFace → Ollama                 │ │
│ └─────────────────────────────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────────────┘

🤖 About the Bielik Model

Bielik is a groundbreaking Polish language model created by SpeakLeash, a foundation dedicated to developing Polish artificial intelligence.


🚀 How it Works:

  1. The Bielik package connects to your local Ollama server
  2. Ollama runs the Bielik model (downloaded from HuggingFace via SpeakLeash)
  3. The chat interface (CLI/web) sends queries → Ollama API → Bielik model → responses
  4. A fallback system ensures connectivity (REST API → ollama library)
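
The fallback in step 4 can be pictured with a short Python sketch. This is a simplified illustration, not the package's actual internals; it assumes Ollama's OpenAI-compatible /v1/chat/completions endpoint shown in the architecture diagram, plus the official ollama library as the secondary path:

import requests

def send_chat(messages, model="SpeakLeash/bielik-7b-instruct-v0.1-gguf",
              host="http://localhost:11434"):
    """Try the REST API first; fall back to the ollama library."""
    try:
        # Primary path: HTTP request to Ollama's OpenAI-compatible endpoint
        resp = requests.post(
            f"{host}/v1/chat/completions",
            json={"model": model, "messages": messages},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]
    except requests.RequestException:
        # Fallback path: direct integration via the ollama library
        import ollama
        reply = ollama.chat(model=model, messages=messages)
        return reply["message"]["content"]

print(send_chat([{"role": "user", "content": "Cześć, Bielik!"}]))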

📌 Features

  • 🎯 Auto-Setup System - intelligent first-time setup for beginners
    • 🔍 Detection - automatically detects missing components
    • 📦 Installation - cross-platform Ollama installation (Linux/macOS)
    • 🚀 Configuration - downloads and configures the Bielik model
    • 🛠️ Interactive - user-friendly prompts and error handling
  • 🖥️ Enhanced CLI (bielik) - interactive chat shell with modular architecture
    • 📋 Commands - :help, :status, :setup, :clear, :exit, :models, :download, :delete, :switch, :storage
    • ⚙️ Arguments - --setup, --no-setup, --model, --host, --use-local, --local-model
    • 🤗 HF Integration - direct Hugging Face model management and local execution
    • 🔄 Fallback - REST API primary, ollama library secondary, local models tertiary
    • 🌍 Cross-platform - Windows, macOS, and Linux support
  • 🐍 Python API - programmatic access via the BielikClient class
    • 💬 Chat methods - chat(), query(), conversation management
    • 🔧 System control - status checking, auto-setup, model management
    • 🤗 HF Models - download, manage, and run SpeakLeash models locally
    • 📤 Export - conversation history in JSON, text, and Markdown formats
  • 🌐 Web Server (FastAPI on port 8888)
    • 📡 REST - POST /chat endpoint for JSON communication
    • ⚡ WebSocket - WS /ws for real-time chat
    • 🔄 Fallback - same dual connectivity as the CLI
  • 🧪 Quality Assurance - comprehensive testing and development tools
    • ✅ Unit tests - full coverage with mocked APIs
    • 🔧 CI/CD - GitHub Actions automation
    • 📊 Code quality - Flake8 linting, automated builds

โš™๏ธ Installation

pip install bielik

Optional dependency (official Ollama lib):

pip install "bielik[ollama]"

🚀 Quick Start Guide

🎯 NEW: Automatic Setup (Recommended)

The easiest way to get started is with the new automatic setup system:

# Install Bielik package
pip install bielik

# Start with automatic setup - it will handle everything!
bielik

# Or force setup mode:
bielik --setup

The auto-setup system will:

  • ✅ Detect if Ollama is installed
  • ✅ Install Ollama automatically (Linux/macOS)
  • ✅ Start the Ollama server if it is not running
  • ✅ Download the Bielik model (SpeakLeash/bielik-7b-instruct-v0.1-gguf)
  • ✅ Configure everything for optimal performance
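
To make the first two checks concrete, here is a minimal, illustrative sketch of how such detection is commonly implemented; it is an assumption about the approach, not the package's actual code:

import shutil
import requests

def ollama_installed() -> bool:
    # Detection: is the ollama binary anywhere on PATH?
    return shutil.which("ollama") is not None

def ollama_running(host: str = "http://localhost:11434") -> bool:
    # A running Ollama server answers GET / with "Ollama is running"
    try:
        return requests.get(host, timeout=2).ok
    except requests.RequestException:
        return False

if __name__ == "__main__":
    print("installed:", ollama_installed())
    print("running:", ollama_running())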

📋 Manual Setup (Advanced Users)

If you prefer manual control:

1๏ธโƒฃ Install Ollama (Cross-platform)

Windows:

# Download installer from ollama.com or use winget:
winget install Ollama.Ollama

macOS:

# Download .dmg from ollama.com or use Homebrew:
brew install ollama

Linux:

# Ubuntu/Debian:
curl -fsSL https://ollama.com/install.sh | sh

# Arch Linux:
sudo pacman -S ollama

2๏ธโƒฃ Setup Bielik Model

# Start Ollama service (Linux/macOS):
ollama serve

# Windows: Ollama starts automatically

# Install Bielik model (new full model name):
ollama pull SpeakLeash/bielik-7b-instruct-v0.1-gguf

# Verify installation:
ollama list

3๏ธโƒฃ Install & Use Bielik Package

# Install from PyPI:
pip install bielik

# Start CLI chat:
bielik

# Skip auto-setup if you prefer manual control:
bielik --no-setup

# Or start web server:
uvicorn bielik.server:app --port 8888

💻 Usage

๐Ÿ–ฅ๏ธ CLI Features & Options

Command-Line Arguments (when starting bielik)

# Basic usage
bielik                                    # Start interactive chat with auto-setup

# Advanced options
bielik --setup                           # Force setup mode
bielik --no-setup                        # Skip automatic setup
bielik --model other-model               # Use different model
bielik --host http://other-host:11434    # Use different Ollama server
bielik --use-local                       # Use local HuggingFace models (bypass Ollama)
bielik --local-model model-name          # Specify local model to use
bielik --help                            # Show all options

Interactive Commands (inside bielik chat session)

โš ๏ธ Important: These commands only work inside the interactive chat session, not as command-line arguments.

# Start interactive session first
$ bielik

# Then use these commands inside the chat:
🧑 You: :help             # Show help and commands
🧑 You: :status           # Check Ollama connection and model availability
🧑 You: :setup            # Run the interactive setup system
🧑 You: :models           # List available and downloaded HuggingFace models
🧑 You: :download <model> # Download a SpeakLeash model from Hugging Face
🧑 You: :delete <model>   # Delete a downloaded model
🧑 You: :switch <model>   # Switch to a different model for execution
🧑 You: :storage          # Show model storage statistics
🧑 You: :clear            # Clear conversation history
🧑 You: :exit             # Quit (or Ctrl+C)

Usage Examples

# ✅ Correct - start with a specific model
$ bielik --model SpeakLeash/Bielik-4.5B-v3.0-Instruct-GGUF

# ✅ Correct - interactive commands inside the session
$ bielik
🧑 You: :status
🧑 You: :switch SpeakLeash/Bielik-4.5B-v3.0-Instruct-GGUF

# ❌ Incorrect - interactive commands as CLI arguments
$ bielik :status          # This won't work!
$ bielik :switch model    # This won't work!

๐Ÿ Python API

NEW: Use Bielik programmatically in your Python applications:

from bielik.client import BielikClient

# Create client with auto-setup
client = BielikClient()

# Send a message
response = client.chat("How are you?")
print(response)

# Get system status
status = client.get_status()
print(f"Ollama running: {status['ollama_running']}")
print(f"Model available: {status['model_available']}")

# Export conversation
history = client.export_conversation(format="markdown")

Quick functions:

from bielik.client import quick_chat, get_system_status

# One-off query
response = quick_chat("What is artificial intelligence?")

# Check system without setup
status = get_system_status()

BielikClient Options:

  • model: Model name to use
  • host: Ollama server URL
  • auto_setup: Enable/disable automatic setup (default: True)
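
For example, combining these options (the parameter names come from the list above; the exact signature may differ):

from bielik.client import BielikClient

# Point the client at a remote Ollama server and skip auto-setup
client = BielikClient(
    model="SpeakLeash/bielik-7b-instruct-v0.1-gguf",
    host="http://192.168.1.50:11434",
    auto_setup=False,
)
print(client.chat("Dzień dobry!"))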

Web API

uvicorn bielik.server:app --port 8888

Endpoints:

  • POST /chat - JSON chat endpoint
  • WS /ws - WebSocket real-time chat

Example request:

{"messages": [{"role":"user","content":"Hello!"}]}
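
The same request can be sent from Python with requests (this assumes the server started above is listening on localhost:8888; the response schema is not documented here, so the raw JSON is printed):

import requests

payload = {"messages": [{"role": "user", "content": "Hello!"}]}
resp = requests.post("http://localhost:8888/chat", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json())  # raw JSON reply from the server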

🔧 Environment Variables

  • OLLAMA_HOST - default: http://localhost:11434
  • BIELIK_MODEL - default: SpeakLeash/bielik-7b-instruct-v0.1-gguf
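
Conceptually, these settings resolve as in the following sketch (illustrative only; the actual lookup presumably lives in the package's config.py):

import os

# Fall back to the documented defaults when the variables are unset
host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
model = os.environ.get("BIELIK_MODEL", "SpeakLeash/bielik-7b-instruct-v0.1-gguf")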

🛠️ Troubleshooting

Auto-Setup Issues

Problem: Auto-setup fails to install Ollama

# Manual installation required for Windows
# Download from: https://ollama.com/download/windows

# Linux/macOS alternatives:
curl -fsSL https://ollama.com/install.sh | sh  # Linux
brew install ollama                             # macOS

Problem: Model download fails or times out

# Try manual download with longer timeout
ollama pull SpeakLeash/bielik-7b-instruct-v0.1-gguf

# Check available disk space (model is ~4GB)
df -h

# Check network connection
curl -I https://ollama.com

Problem: Ollama server won't start

# Check if port 11434 is in use
lsof -i :11434  # Linux/macOS
netstat -an | findstr 11434  # Windows

# Try a different port (Ollama reads its bind address from OLLAMA_HOST)
OLLAMA_HOST=127.0.0.1:11435 ollama serve

Runtime Issues

Problem: "Connection refused" errors

# Check Ollama status from inside the interactive session
$ bielik
🧑 You: :status

# If needed, switch to an available model
🧑 You: :switch SpeakLeash/Bielik-4.5B-v3.0-Instruct-GGUF

# Restart Ollama service
pkill ollama && ollama serve  # Linux/macOS

# Manual server start
ollama serve

Problem: Model responses are slow

# Check system resources
htop      # Linux/macOS
taskmgr   # Windows

# Use smaller model if needed
bielik --model llama2  # If available

Problem: Python API import errors

# Reinstall with dependencies
pip uninstall bielik
pip install "bielik[ollama]"

# Check Python path
python -c "import bielik.client; print('OK')"

Getting Help

  • GitHub Issues: Report bugs and feature requests
  • Command Help: bielik --help or :help in CLI
  • System Status: Use :status command or get_system_status() function

๐Ÿ“ Development

git clone https://github.com/tomsapletta/bielik.git
cd bielik
python -m venv .venv
source .venv/bin/activate
pip install -e ".[ollama]"

📂 Package Structure

bielik/
├── bielik/
│   ├── __init__.py          # Package initialization
│   ├── cli.py               # CLI entry point (wrapper)
│   ├── client.py            # Client entry point (wrapper)
│   ├── server.py            # FastAPI web server
│   ├── config.py            # Configuration management
│   ├── hf_models.py         # Hugging Face model management
│   ├── content_processor.py # Content processing utilities
│   ├── cli/                 # Modular CLI components
│   │   ├── __init__.py
│   │   ├── main.py          # Main CLI entry and argument parsing
│   │   ├── commands.py      # Command processing and execution
│   │   ├── models.py        # HF model management CLI
│   │   ├── setup.py         # Interactive setup manager
│   │   └── send_chat.py     # Chat communication handling
│   └── client/              # Modular client components
│       ├── __init__.py      # Client package exports
│       ├── core.py          # Core BielikClient class
│       ├── model_manager.py # HF model operations for client
│       └── utils.py         # Client utility functions
├── tests/
│   ├── __init__.py
│   ├── test_cli.py          # CLI unit tests
│   └── test_server.py       # Server unit tests
├── pyproject.toml           # Modern Python packaging
├── setup.cfg                # Package configuration
├── MANIFEST.in              # Package manifest
├── LICENSE                  # Apache 2.0 license
├── README.md                # This documentation
├── Makefile                 # Development automation
├── todo.md                  # Project specifications
└── .github/workflows/       # CI/CD automation
    └── python-publish.yml

📜 License

Apache License 2.0
