Bielik – Local-first AI assistant with HuggingFace and SpeakLeash models
🦅 Bielik
Bielik is a groundbreaking Polish language model created by SpeakLeash, a foundation dedicated to the development of Polish artificial intelligence.
🇵🇱 Bielik is available on HuggingFace.
You can now test it directly in the shell, using several features that let you build a multi-agent environment in your company 🚀
- 🎯 HuggingFace Integration – direct model downloads from the HF Hub
- 💬 Polish Language Optimized – built for Polish conversation and analysis
- 🖼️ Vision Capabilities – image analysis and visual question answering
- 📄 Document Processing – PDF, DOCX, and web content analysis
- 🐳 Docker Ready – containerized testing environments
- ⚡ Lightweight – minimal (~50MB) or full (~2GB) installation options
Author: Tom Sapletta
License: Apache-2.0
📋 Navigation Menu
- 🏗️ Architecture
- ⚡ Quick Start
- 🛠️ Installation
- 🎯 Context Provider Commands
- 💬 Usage Examples
- 🐳 Docker Testing Framework
- 📚 Documentation
- 🤝 Contributing
🏗️ Architecture
┌─────────────────────────────────────────────────────────────────┐
│                        🦅 BIELIK SYSTEM                         │
├─────────────────────┬─────────────────────┬─────────────────────┤
│  🖥️ CLI Shell       │  🌐 FastAPI Server  │  🐳 Docker Tests    │
│  • Interactive      │  • REST /chat       │  • Minimal          │
│  • Personalized     │  • WebSocket /ws    │  • Full             │
│  • Multi-modal      │  • Port 8000        │  • CI/CD            │
└─────────────────────┴──────────┬──────────┴─────────────────────┘
                                 ▼
┌─────────────────────────────────────────────────────────────────┐
│                    🤗 HUGGINGFACE INTEGRATION                   │
│   Direct Downloads ──────────▶ Local Model Execution            │
│   ├─ HF Hub API                ├─ Transformers Pipeline         │
│   └─ Model Management          └─ Vision Models (optional)      │
└────────────────────────────────┬────────────────────────────────┘
                                 ▼
┌─────────────────────────────────────────────────────────────────┐
│                   🇵🇱 POLISH LANGUAGE MODELS                     │
│   🤗 Speakleash/Bielik models (HuggingFace Hub)                 │
│   📥 Direct: HuggingFace → Local Storage → Execution            │
│   🎯 Polish-optimized conversation and analysis                 │
└─────────────────────────────────────────────────────────────────┘
🤗 About the Bielik Model
Bielik is a groundbreaking Polish language model created by SpeakLeash, a foundation dedicated to the development of Polish artificial intelligence.
🔗 External Dependencies & Links:
- HuggingFace Hub – primary source for Polish language models
- Bielik models on HuggingFace – official model repository
- SpeakLeash Foundation – creators of the Bielik models
- Polish AI Initiative – government support for Polish AI
📊 How it Works:
- Bielik CLI connects directly to the HuggingFace Hub
- Models are downloaded from the Speakleash organization on HuggingFace
- Local execution uses the Transformers library (optional for vision)
- Chat interface (CLI/Web) → Local Models → Polish responses
- Modular design supports text-only or full vision capabilities
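The download-then-execute flow above can be sketched with the public `huggingface_hub` and `transformers` APIs. This is an illustration of the pipeline, not Bielik's actual internals; `build_prompt` is a hypothetical helper (real Bielik models ship their own chat templating), and the heavy download/inference step is wrapped in a function because it needs the `[local]` extras plus several GB of disk and RAM.

```python
from typing import Dict, List


def build_prompt(messages: List[Dict[str, str]]) -> str:
    """Flatten a chat history into a single prompt string.

    A simplified stand-in for real chat templating, just to show
    the CLI -> local model -> response flow.
    """
    lines = [f"{m['role']}: {m['content']}" for m in messages]
    return "\n".join(lines) + "\nassistant:"


def demo_download_and_run() -> None:
    """Network-heavy part: fetch the model from the HF Hub and run it locally.

    Requires `pip install bielik[local]` (or transformers + huggingface_hub)
    and significant disk space/RAM, so it is not executed on import.
    """
    from huggingface_hub import snapshot_download
    from transformers import pipeline

    # snapshot_download caches under HF_HOME, so repeat calls are instant.
    model_dir = snapshot_download("speakleash/bielik-4.5b-v3.0-instruct")
    generate = pipeline("text-generation", model=model_dir)

    prompt = build_prompt([{"role": "user", "content": "Czym jest Bielik?"}])
    print(generate(prompt, max_new_tokens=50)[0]["generated_text"])
```

Calling `demo_download_and_run()` performs the full HuggingFace → local storage → execution chain shown in the diagram above.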
🚀 Features
- 🎯 HuggingFace Integration – direct model downloads and management
- 🔍 Model Discovery – browse available Polish language models
- 📦 Smart Downloads – automatic model caching and versioning
- 🔄 Auto-Switch – automatically switches to newly downloaded models
- 🛠️ Interactive – user-friendly model selection on first startup
- 🖥️ Enhanced CLI (`python -m bielik`) – personalized chat experience
  - 📋 Commands – `:help`, `:models`, `:download`, `:delete`, `:switch`, `:settings`, `:name`
  - ⚙️ Personalization – custom user names, dynamic assistant names
  - 🤗 HF Management – direct HuggingFace model operations
  - 📁 Content Analysis – folder scanning, document processing
  - 🌍 Cross-platform – Windows, macOS, Linux support
- 🖼️ Vision Capabilities (full version only)
  - 📷 Image Analysis – automatic image captioning in Polish
  - ❓ Visual QA – ask questions about images
  - 🎨 Multi-modal – combined text and image understanding
  - ⚡ GPU Support – hardware acceleration for faster processing
- 🐍 Python API – programmatic access via the `BielikClient` class
  - 💬 Chat methods – `chat()`, `query()`, conversation management
  - 🔧 Model control – download, switch, and manage models
  - 🤗 HF Models – full HuggingFace integration
  - 📤 Export – conversation history in multiple formats
- 🌐 Web Server (FastAPI on port 8000)
  - 📡 REST – `POST /chat` endpoint for JSON communication
  - ⚡ WebSocket – `WS /ws` for real-time chat
  - 🖼️ Multi-modal – support for text and image inputs
- 🐳 Docker Support – complete containerized testing
  - 📦 Minimal Version – lightweight text-only container (~50MB)
  - 🎯 Full Version – complete vision-enabled container (~2GB)
  - 🔧 CI/CD – automated testing environments
⚡ Performance-Optimized Installation
Bielik features lazy loading and an optimized startup path for faster performance. We recommend Conda for the most reliable installation experience.
📝 Installation Guide
🐍 Conda Installation (Recommended)
Prerequisites: a working Conda installation (Miniconda or Anaconda).
1. Create and Activate Environment
# Create a new conda environment
conda create -n bielik python=3.11 -y
conda activate bielik
2. Install System Dependencies
conda install -c conda-forge -y \
cmake \
make \
gcc_linux-64 \
gxx_linux-64 \
libgcc \
libstdcxx-ng
3. Install PyTorch (CPU or GPU)
For CPU-only:
conda install -c pytorch -y \
pytorch \
torchvision \
torchaudio \
cpuonly
For NVIDIA GPU (CUDA 11.8):
conda install -c pytorch -y \
pytorch \
torchvision \
torchaudio \
pytorch-cuda=11.8 \
-c nvidia
4. Install Bielik and Dependencies
# Install Hugging Face libraries
conda install -c huggingface -y \
transformers \
accelerate \
sentencepiece \
huggingface-hub \
bitsandbytes
# Install llama-cpp-python with optimized flags
CMAKE_ARGS="-DLLAMA_NO_METAL=on" FORCE_CMAKE=1 \
pip install llama-cpp-python --no-cache-dir
# Install Bielik in development mode
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik
pip install -e ".[local]"
5. Verify Installation
python scripts/verify_installation.py
⚡ Optimizing Model Loading
Bielik includes optimized model loading with a 10-second default timeout and a debug mode:
# Start Bielik with debug mode (shows detailed loading information)
BIELIK_DEBUG=1 bielik
# Set a custom timeout (in seconds)
BIELIK_LOAD_TIMEOUT=15 bielik
If a model takes longer than the timeout to load, the debugger activates automatically and shows detailed logs to help diagnose the issue.
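The timeout behaviour described above can be approximated in plain Python: run the load in a worker thread and fall back to diagnostics if it has not finished in time. This is a sketch of the idea only, not Bielik's actual loader; `load_with_timeout` is a hypothetical helper.

```python
import threading


def load_with_timeout(load_fn, timeout: float = 10.0):
    """Run load_fn in a worker thread; return its result, or None on timeout.

    Mirrors the BIELIK_LOAD_TIMEOUT idea: a slow model load does not hang
    the CLI, it triggers diagnostics instead.
    """
    result = {}

    def worker():
        # Store the loaded object so the main thread can pick it up.
        result["model"] = load_fn()

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    t.join(timeout)
    if t.is_alive():
        # Timed out: the daemon thread keeps running in the background,
        # but the caller gets control back immediately.
        print("Load timed out - set BIELIK_DEBUG=1 for detailed logs")
        return None
    return result.get("model")
```

For example, `load_with_timeout(lambda: "fake-model", timeout=1.0)` returns immediately, while a loader that sleeps past the timeout yields `None`.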
🖥️ Platform-Specific Notes
Windows
# Use Anaconda Prompt with Administrator privileges
# Install Visual Studio Build Tools with C++ workload
macOS (Intel/Apple Silicon)
# For Apple Silicon, use this flag when installing llama-cpp-python:
CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 \
pip install llama-cpp-python --no-cache-dir
Linux
# Install system dependencies
sudo apt-get update && sudo apt-get install -y \
build-essential \
libssl-dev \
zlib1g-dev \
libbz2-dev \
libreadline-dev \
libsqlite3-dev \
wget \
llvm \
libncurses5-dev \
libncursesw5-dev \
xz-utils \
tk-dev \
libffi-dev \
liblzma-dev \
python3-openssl \
git
🛠️ Troubleshooting
Model Loading Issues
If models are loading slowly or timing out:
- Check your internet connection
- Verify you have enough disk space (~10GB free recommended)
- Enable debug mode: BIELIK_DEBUG=1 bielik
- Try clearing the model cache: rm -rf ~/.cache/huggingface/hub
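Before wiping the cache it can help to see how much space it actually uses. A small stdlib-only helper (illustrative, not part of Bielik):

```python
import os


def dir_size_mb(path: str) -> float:
    """Total size of all regular files under path, in megabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.exists(fp):  # skip dangling symlinks
                total += os.path.getsize(fp)
    return total / (1024 * 1024)


# Example:
# print(f"{dir_size_mb(os.path.expanduser('~/.cache/huggingface/hub')):.1f} MB")
```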
GPU Acceleration
To enable GPU acceleration, ensure you have:
- Latest NVIDIA drivers installed
- A CUDA toolkit matching your PyTorch version
- A properly configured LD_LIBRARY_PATH
Common Errors
- CUDA out of memory: reduce the batch size or use a smaller model
- Missing libraries: install the required system dependencies
- Permission errors: run with appropriate permissions or use pip's --user flag
📖 Manual Installation
# 1. Clone repository
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik
# 2. Run universal installer
python install.py
🔧 Troubleshooting
Common Issues
1. Conda Environment Issues
Problem: Conda command not found
Solution:
# Add Conda to PATH
export PATH="$HOME/miniconda3/bin:$PATH"
# Initialize Conda for your shell
eval "$(conda shell.bash hook)"
2. CUDA/GPU Support
Problem: CUDA not detected
Solution: Install CUDA toolkit and configure Conda:
conda install -c nvidia cuda-toolkit
conda install -c conda-forge cudatoolkit-dev
3. Slow Model Loading
Problem: First load is slow
Solution: This is normal for the first load. Subsequent loads will be faster thanks to lazy loading.
4. Missing Dependencies
Problem: Missing system libraries
Solution: Install required system packages:
# Ubuntu/Debian
sudo apt-get update && sudo apt-get install -y \
build-essential \
cmake \
libopenblas-dev \
liblapack-dev \
libjpeg-dev
# CentOS/RHEL
sudo yum groupinstall -y "Development Tools"
sudo yum install -y cmake openblas-devel lapack-devel libjpeg-turbo-devel
Verifying Your Installation
Run the verification script to check for common issues:
python scripts/verify_installation.py
Getting Help
If you encounter any issues, please:
- Check the GitHub Issues for known problems
- Run the verification script and include its output when reporting issues
- Provide details about your system and the exact error message
🚀 Performance Tips
- First Run: The first run will be slower as models are loaded and cached
- Subsequent Runs: Enjoy faster startup times thanks to lazy loading
- Memory Usage: Close other memory-intensive applications when running Bielik
- GPU Acceleration: For best performance, use a system with CUDA-compatible GPU
🪟 Windows Users
REM Double-click install.bat or run:
install.bat
REM Launch Bielik:
run.bat
🐧 Linux/macOS Users
# Run installer:
./install.sh
# Launch Bielik:
./run.sh
⚙️ Installation Options
# Basic installation (recommended)
python install.py
# Skip AI models (fastest installation)
python install.py --skip-ai
# Use conda/mamba instead of pip
python install.py --conda
# Development installation
python install.py --dev
# Show all options
python install.py --help
🎯 What the Universal Installer Does:
- ✅ Auto-detects your operating system and Python version
- ✅ Creates an isolated virtual environment
- ✅ Installs all dependencies with smart fallback strategies
- ✅ Attempts multiple llama-cpp-python installation methods
- ✅ Creates platform-specific launcher scripts
- ✅ Works even if AI models fail to install (Context Provider Commands still work)
- ✅ Provides clear next steps and troubleshooting
🚀 After Installation
# Universal launcher (any platform)
python run.py
# Platform-specific launchers
./run.sh # Linux/macOS
run.bat # Windows
# Or direct activation
.venv/bin/python -m bielik.cli.main # Linux/macOS
.venv\Scripts\python -m bielik.cli.main # Windows
📦 Alternative: PyPI Installation
For users who prefer traditional pip installation:
🪶 Minimal Version (Text-only)
# Install minimal version (text-only)
pip install bielik
# Start CLI and download your first model
python -m bielik
What's included:
- ✅ Polish conversation and text analysis
- ✅ HuggingFace model downloads and management
- ✅ Document processing (PDF, DOCX, TXT)
- ✅ Web content analysis
- ✅ Context Provider Commands (folder:, calc:, pdf:)
- ✅ Personalized CLI experience
- ❌ Local AI models (requires manual llama-cpp-python installation)
🎯 Full Version (With vision)
# Install full version (text + vision)
pip install bielik[vision]
# Start CLI with image analysis support
python -m bielik
What's included:
- ✅ Everything from the minimal version
- ✅ Image analysis and captioning
- ✅ Visual question answering
- ✅ GPU acceleration support
- ✅ Multi-modal document processing
🔄 Upgrade Options
# Upgrade minimal → full
pip install bielik[vision]
# Or install specific optional features
pip install bielik[local] # Local model execution
pip install bielik[gpu] # GPU acceleration
pip install bielik[dev] # Development tools
🚀 Quick Start Guide
🎯 Instant Start (No setup required!)
Bielik works without any external dependencies. Just install and start chatting:
# Install minimal version
pip install bielik
# Start CLI and choose your first model
python -m bielik
What happens on first run:
- 🔍 Model Selection – choose from available Polish models
- 📥 Auto Download – the selected model downloads from HuggingFace
- 🔄 Auto Switch – Bielik automatically switches to the new model
- 💬 Ready to Chat – start conversing in Polish immediately!
📱 Choose Your Experience
🪶 Minimal Setup (Recommended)
Perfect for text conversations and document analysis:
# 1. Install
pip install bielik
# 2. Start and download your first model
python -m bielik
:download speakleash/bielik-4.5b-v3.0-instruct
# 3. Start chatting in Polish!
Cześć! Jak mogę Ci pomóc?
🎯 Full Setup (For image analysis)
Includes vision capabilities for image analysis:
# 1. Install with vision support
pip install bielik[vision]
# 2. Start and download models
python -m bielik
:download speakleash/bielik-4.5b-v3.0-instruct
# 3. Analyze images
Przeanalizuj to zdjęcie: image.jpg
🐳 Docker Setup (For testing)
Use Docker for isolated testing environments:
# Clone repository
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik
# Test minimal version
docker-compose --profile minimal up bielik-minimal
# Test full version
docker-compose --profile full up bielik-full
🎯 Context Provider Commands
Bielik CLI features a standardized Context Provider Command system that lets you generate structured data for AI analysis. All external commands use the `name:` format (a colon after the command name).
📋 Available Commands
| Command | Format | Description | Example |
|---|---|---|---|
| 📁 Folder Analysis | `folder: <path>` | Directory scanning and file analysis | `folder: ~/Documents` |
| 🧮 Calculator | `calc: <expression>` | Mathematical calculations and evaluations | `calc: 2 + 3 * 4` |
| 📄 Document Reader | `pdf: <file>` | Text extraction from PDF, DOCX, TXT files | `pdf: report.pdf` |
💡 How Context Provider Commands Work
- Generate Context: commands analyze input and create structured data
- AI Integration: data is formatted for AI consumption and analysis
- Independent Operation: commands work without AI models installed
- Project Integration: artifacts are automatically saved to your active project
- Standardized Format: all external commands use the `name:` format for consistency
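The `name: args` convention is straightforward to parse. A sketch of how such input might be split into a command name and its arguments (`parse_context_command` is a hypothetical helper, not Bielik's actual parser):

```python
from typing import List, Optional, Tuple


def parse_context_command(line: str) -> Optional[Tuple[str, List[str]]]:
    """Split "folder: ~/Documents --recursive" into ("folder", [...]).

    Returns None for input that does not follow the `name: args` convention,
    so ordinary chat messages fall through to the AI model.
    """
    head, sep, tail = line.partition(":")
    # Reject: no colon, empty name, or a multi-word prefix (plain prose).
    if not sep or not head.strip() or " " in head.strip():
        return None
    return head.strip(), tail.split()
```

For example, `parse_context_command("calc: 2 + 3 * 4")` yields `("calc", ["2", "+", "3", "*", "4"])`, while a normal sentence yields `None`.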
📚 Usage Examples
# Directory Analysis
bielik -p "folder: ."
bielik -p "folder: ~/Documents --recursive"
# Mathematical Calculations
bielik -p "calc: 2 + 3 * 4"
bielik -p "calc: sqrt(16) + sin(pi/2)"
bielik -p "calc: factorial(5)"
# Document Processing
bielik -p "pdf: document.pdf"
bielik -p "pdf: report.docx --metadata"
bielik -p "pdf: file.pdf --pages=1-5"
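The `calc:` examples above accept functions like `sqrt` and `factorial`. One common way to support that safely is to evaluate the expression against the `math` module only, with builtins blocked; this is a sketch of the technique, not Bielik's implementation:

```python
import math


def safe_calc(expression: str):
    """Evaluate a math expression with access to math.* names only.

    Blocking builtins keeps expressions like "sqrt(16) + sin(pi/2)"
    working while rejecting arbitrary Python such as __import__("os").
    """
    allowed = {
        name: getattr(math, name)
        for name in dir(math)
        if not name.startswith("_")
    }
    allowed["abs"] = abs  # a convenient extra beyond math.*
    return eval(expression, {"__builtins__": {}}, allowed)


# safe_calc("sqrt(16) + sin(pi/2)") -> 5.0
# safe_calc("factorial(5)")         -> 120
```

Note that `eval`-based calculators should always be sandboxed like this (or replaced with `ast`-based parsing) before being exposed to untrusted input.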
⚡ Interactive Usage
In interactive mode, Context Provider Commands load data for AI analysis:
python -m bielik
# Load directory context
> folder: ~/projects
✅ Context loaded! You can now ask questions about the provided data.
# Ask AI about the loaded context
> What files are in this directory?
> Which file is the largest?
> Are there any Python files?
📊 Project Management System
Bielik features a comprehensive session-based project management system that organizes your analysis artifacts into shareable HTML projects.
🎯 Key Features
- 🗂️ Multi-Project Sessions – work on multiple projects simultaneously
- 📄 HTML Artifact Generation – interactive project representations
- 🔄 Automatic Artifact Collection – Context Provider Commands auto-save to the active project
- 🌐 Browser-Friendly Viewing – projects open in your default browser
- 📊 Rich Metadata – embedded project information and artifact tracking
- ✅ Validation System – comprehensive integrity checking
📝 Project Management Commands
| Command | Description | Example |
|---|---|---|
| `:project create <name>` | Create new project | `:project create "Data Analysis"` |
| `:project create <name> <desc> --tags <tags>` | Create with metadata | `:project create "ML Project" "Machine learning analysis" --tags ml,data` |
| `:project switch <id\|name>` | Switch to project | `:project switch data-analysis` |
| `:project list` | List all projects | `:project list` |
| `:project info [id]` | Show project details | `:project info` |
| `:project open [id]` | Open in browser | `:project open` |
| `:project validate [id]` | Validate project | `:project validate` |
🎨 Project Workflow
# 1. Create a new project
python -m bielik
> :project create "Website Analysis" "Analyzing project structure"
✅ Project Created Successfully
🎯 Project is now active. Use Context Provider Commands to add artifacts.
# 2. Add artifacts using Context Provider Commands
> folder: ~/my-website
🎯 Artifact Added to Project: Website Analysis
> calc: file_count * average_size / 1024
🎯 Artifact Added to Project: Website Analysis
> pdf: documentation.pdf
🎯 Artifact Added to Project: Website Analysis
# 3. View your project
> :project open
🌐 Project Opened in Browser
# 4. Switch between projects
> :project list
📋 Bielik Projects (Current Session)
🎯 **ACTIVE** Website Analysis
   🆔 ID: abc123ef...
   📄 Artifacts: 3
   📅 Created: 2024-12-20 14:30
📁 Data Analysis
   🆔 ID: def456gh...
   📄 Artifacts: 5
   📅 Created: 2024-12-20 12:15
> :project switch def456gh
✅ Switched to Project: Data Analysis
🏗️ HTML Project Structure
Each project generates a standalone HTML file with:
- 📊 Interactive Dashboard – project overview with metadata
- 🎨 Clean Design – modern, responsive interface
- 📄 Artifact Viewer – organized display of all artifacts
- 📊 Rich Metadata – embedded in HTML attributes for programmatic access
- 🌐 Offline Capability – works without an internet connection
Example Project Structure:
./bielik_projects/
├── abc123ef-ghij-klmn-opqr-stuvwxyz0123/
│   ├── index.html       # Interactive project page
│   └── metadata.json    # Project metadata
└── def456gh-ijkl-mnop-qrst-uvwxyz012345/
    ├── index.html
    └── metadata.json
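Given that layout, a session's projects can be enumerated by reading each `metadata.json`. An illustrative stdlib-only sketch; the actual metadata schema is an assumption here, and `list_projects` is not part of Bielik's API:

```python
import json
import os
from typing import Dict, List


def list_projects(root: str = "./bielik_projects") -> List[Dict]:
    """Collect metadata.json from every project directory under root."""
    projects = []
    if not os.path.isdir(root):
        return projects
    for entry in sorted(os.listdir(root)):
        meta_path = os.path.join(root, entry, "metadata.json")
        if os.path.isfile(meta_path):
            with open(meta_path, encoding="utf-8") as f:
                meta = json.load(f)
            meta["project_id"] = entry  # directory name doubles as the ID
            projects.append(meta)
    return projects
```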
🔍 Validation System
Bielik includes comprehensive validators for:
- 📄 HTML Artifacts – structure, metadata, and compliance checking
- ⚙️ Environment Files – .env configuration validation
- 📜 Command Scripts – code quality and compliance verification
# Validate current project
> :project validate
🔍 Project Validation Results
📊 Status: ✅ VALID (0 errors, 0 warnings)
# Validate specific project
> :project validate abc123ef
💻 Usage
🖥️ CLI Commands & Options
Starting Bielik
# Basic usage
python -m bielik # Start interactive chat
bielik # Alternative (if in PATH)
# Advanced options
python -m bielik --help # Show all options
Interactive Commands (inside chat session)
📝 Model Management:
:models # List available HuggingFace models
:download <model-name> # Download model from HuggingFace
:switch <model-name> # Switch to downloaded model
:delete <model-name> # Delete model from local storage
⚙️ Personalization:
:name <your-name> # Set your display name
:settings # Show current configuration
🛠️ Utilities:
:help # Show all commands
:clear # Clear conversation history
:exit # Quit (or Ctrl+C)
Usage Examples
First Time Setup:
$ python -m bielik
# Choose from recommended models:
# 1. speakleash/bielik-4.5b-v3.0-instruct (Recommended)
# 2. speakleash/bielik-7b-instruct-v0.1
# Enter choice: 1
📥 Downloading speakleash/bielik-4.5b-v3.0-instruct...
✅ Model ready! Switching to bielik-4.5b-v3.0-instruct
👤 You: Cześć! Jak się masz?
🤖 bielik-4.5b: Cześć! Mam się dobrze, dziękuję...
Everyday Usage:
👤 You: :name Jan
✅ Display name set to: Jan
👤 Jan: Przeanalizuj folder ~/dokumenty
🤖 bielik-4.5b: [Analyzes folder structure and contents]
👤 Jan: Opisz to zdjęcie: vacation.jpg   # (Full version only)
🤖 bielik-4.5b: [Describes image in Polish]
🐍 Python API
Use Bielik programmatically in your Python applications:
from bielik.client import BielikClient
# Create client (auto-downloads model if needed)
client = BielikClient()
# Send a Polish message
response = client.chat("Napisz krótki wiersz o Polsce")
print(response)
# Get model status
status = client.get_status()
print(f"Current model: {status['current_model']}")
print(f"Models available: {status['models_available']}")
# Export conversation
history = client.export_conversation(format="markdown")
Quick functions:
from bielik.client import quick_chat
# One-off Polish query
response = quick_chat("Co to jest sztuczna inteligencja?")
print(response)
BielikClient options:
- model: specific HuggingFace model to use
- auto_download: auto-download the model if missing (default: True)
🌐 Web Server
# Start web server
uvicorn bielik.server:app --port 8000
# Or with Docker
docker-compose --profile minimal up bielik-minimal
Endpoints:
- POST /chat – JSON chat endpoint
- WS /ws – WebSocket real-time chat
- GET /models – List available models
Example request:
{"messages": [{"role": "user", "content": "Cześć! Jak się masz?"}]}
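A client for the `/chat` endpoint needs nothing beyond the standard library. A hedged sketch: it assumes the server above is running on localhost:8000 and returns a JSON body, and the `chat`/`build_payload` helpers are illustrative, not a Bielik API.

```python
import json
import urllib.request


def build_payload(content: str) -> bytes:
    """Encode a single user message in the server's expected JSON shape."""
    return json.dumps(
        {"messages": [{"role": "user", "content": content}]}
    ).encode("utf-8")


def chat(content: str, url: str = "http://localhost:8000/chat") -> dict:
    """POST one message to the Bielik server and return the parsed JSON reply."""
    req = urllib.request.Request(
        url,
        data=build_payload(content),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # requires a running server
        return json.loads(resp.read().decode("utf-8"))


# chat("Cześć! Jak się masz?")  # only works with the server running
```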
🐳 Docker Usage
Quick Testing:
# Test minimal version
docker run -it bielik:minimal
# Test full version with GPU
docker run --gpus all -it bielik:full
With persistent storage:
# Minimal with model persistence
docker run -it -v $(pwd)/models:/app/models bielik:minimal
# Full with models and images
docker run -it \
-v $(pwd)/models:/app/models \
-v $(pwd)/images:/app/images \
bielik:full
Development setup:
# Clone and test
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik
# Run automated tests
docker-compose --profile test up test-runner
# Interactive development
docker-compose --profile minimal run --rm bielik-minimal bash
🔧 Environment Variables
Core Settings:
- BIELIK_CLI_USERNAME – your display name in the CLI (auto-detected from the system)
- BIELIK_CLI_CURRENT_MODEL – currently selected model
- BIELIK_CLI_ASSISTANT_NAME – assistant display name (auto-set from the model)
- BIELIK_CLI_AUTO_SWITCH – auto-switch to newly downloaded models (default: true)
Storage & Cache:
- BIELIK_MODELS_DIR – local model storage directory
- BIELIK_DATA_DIR – user data and settings directory
- HF_HOME – HuggingFace cache directory
Docker Environment:
- BIELIK_MODE – minimal or full (Docker only)
- BIELIK_IMAGES_DIR – images directory for analysis (full version)
🛠️ Troubleshooting
Installation Issues
Problem: Minimal version works but vision features don't
# Upgrade to full version
pip install bielik[vision]
# Verify vision packages
python -c "import PIL, transformers; print('Vision packages OK')"
Problem: Model download fails or times out
# Check HuggingFace connectivity
python -c "from huggingface_hub import HfApi; print('HF connection OK')"
# Check available disk space (models are 2-8GB)
df -h
# Manual download with timeout
:download speakleash/bielik-4.5b-v3.0-instruct
Problem: "No models available" on first startup
# Download a model manually
python -m bielik
:models
:download speakleash/bielik-4.5b-v3.0-instruct
Runtime Issues
Problem: Model responses are slow or use too much memory
# Use smaller model
:switch speakleash/bielik-4.5b-v3.0-instruct # instead of 7b
# Check system resources
htop # Linux/macOS
taskmgr # Windows
# Enable GPU acceleration (full version)
pip install bielik[gpu]
Problem: Image analysis not working
# Check if vision packages installed
python -c "from bielik.image_analyzer import ImageAnalyzer; ia = ImageAnalyzer(); print(f'Available: {ia.is_available()}')"
# Install vision support
pip install bielik[vision]
Problem: CLI settings not persisting
# Check .env file creation
ls -la ~/.bielik/ || ls -la ./
# Reset settings
:name YourName
:settings
Docker Issues
Problem: Docker containers fail to start
# Build images manually
docker build -f docker/Dockerfile.minimal -t bielik:minimal .
docker build -f docker/Dockerfile.full -t bielik:full .
# Check container logs
docker logs bielik-minimal
Problem: Models not persisting between Docker runs
# Use volume mounts
docker run -v $(pwd)/models:/app/models bielik:minimal
# Or use Docker Compose
docker-compose --profile minimal up
Getting Help
- GitHub Issues: report bugs and feature requests
- Command Help: python -m bielik --help or :help in the CLI
- Test Environment: use Docker for isolated testing
- Check Status: use the :settings command for the current configuration
🛠️ Creating Custom Context Provider Commands
Bielik CLI supports extensible Context Provider Commands that allow you to create custom name: commands that generate structured context data for AI analysis.
🚀 Quick Setup Guide
1. Create the command directory:
mkdir -p commands/mycommand
cd commands/mycommand
2. Create your command files:
- main.py – your command implementation
- config.json – command configuration and metadata
🐍 Python Code Examples
Basic Context Provider Command Template
# commands/mycommand/main.py
from typing import List, Dict, Any
from bielik.cli.command_api import ContextProviderCommand


class MyCommand(ContextProviderCommand):
    def __init__(self):
        super().__init__(
            name="mycommand",
            description="Custom context provider that analyzes something",
            usage="mycommand: <path_or_input>"
        )

    def provide_context(self, args: List[str], context: Dict[str, Any]) -> Dict[str, Any]:
        """
        Generate context data from the provided arguments.

        Args:
            args: List of arguments after 'mycommand:'
            context: Additional context data

        Returns:
            Dict containing structured context data
        """
        if not args:
            return {
                "error": "Please provide input after 'mycommand:'",
                "usage": self.get_usage()
            }

        input_data = " ".join(args)
        # Your custom logic here
        analysis_result = self._analyze_input(input_data)

        return {
            "command": "mycommand",
            "input": input_data,
            "analysis": analysis_result,
            "timestamp": context.get("timestamp", ""),
            "context_type": "custom_analysis"
        }

    def _analyze_input(self, input_data: str) -> Dict[str, Any]:
        """Custom analysis logic - implement your functionality here."""
        return {
            "input_length": len(input_data),
            "word_count": len(input_data.split()),
            "type": "text_analysis",
            "summary": f"Analyzed text with {len(input_data)} characters"
        }


# Required: Command instance for the registry
command = MyCommand()
Advanced Example: Git Repository Analyzer
# commands/git/main.py
import os
import subprocess
from typing import List, Dict, Any
from bielik.cli.command_api import ContextProviderCommand


class GitCommand(ContextProviderCommand):
    def __init__(self):
        super().__init__(
            name="git",
            description="Analyzes Git repository information and recent commits",
            usage="git: [path_to_repo]"
        )

    def provide_context(self, args: List[str], context: Dict[str, Any]) -> Dict[str, Any]:
        repo_path = args[0] if args else "."

        if not self._is_git_repo(repo_path):
            return {
                "error": f"Not a git repository: {repo_path}",
                "suggestion": "Run this command in a git repository directory"
            }

        return {
            "command": "git",
            "repository_path": os.path.abspath(repo_path),
            "branch_info": self._get_branch_info(repo_path),
            "recent_commits": self._get_recent_commits(repo_path),
            "status": self._get_status(repo_path),
            "remote_info": self._get_remote_info(repo_path),
            "context_type": "git_analysis"
        }

    def _is_git_repo(self, path: str) -> bool:
        """Check if directory is a git repository."""
        try:
            result = subprocess.run(
                ["git", "rev-parse", "--git-dir"],
                cwd=path, capture_output=True, text=True
            )
            return result.returncode == 0
        except (OSError, subprocess.SubprocessError):
            return False

    def _get_branch_info(self, path: str) -> Dict[str, str]:
        """Get current branch information."""
        try:
            branch_result = subprocess.run(
                ["git", "branch", "--show-current"],
                cwd=path, capture_output=True, text=True
            )
            return {
                "current_branch": branch_result.stdout.strip(),
                "status": "active"
            }
        except (OSError, subprocess.SubprocessError):
            return {"error": "Could not determine branch info"}

    def _get_recent_commits(self, path: str, limit: int = 5) -> List[Dict[str, str]]:
        """Get recent commit information."""
        try:
            result = subprocess.run([
                "git", "log", "--oneline", f"-{limit}",
                "--pretty=format:%h|%an|%ar|%s"
            ], cwd=path, capture_output=True, text=True)

            commits = []
            for line in result.stdout.strip().split('\n'):
                if '|' in line:
                    hash_id, author, date, message = line.split('|', 3)
                    commits.append({
                        "hash": hash_id,
                        "author": author,
                        "date": date,
                        "message": message
                    })
            return commits
        except (OSError, subprocess.SubprocessError):
            return [{"error": "Could not fetch commit history"}]

    def _get_status(self, path: str) -> Dict[str, Any]:
        """Get git status information."""
        try:
            result = subprocess.run(
                ["git", "status", "--porcelain"],
                cwd=path, capture_output=True, text=True
            )
            modified_files = []
            for line in result.stdout.strip().split('\n'):
                if line.strip():
                    status, filename = line[:2], line[3:]
                    modified_files.append({
                        "file": filename,
                        "status": status.strip()
                    })
            return {
                "modified_files": modified_files,
                "clean": len(modified_files) == 0
            }
        except (OSError, subprocess.SubprocessError):
            return {"error": "Could not get status"}

    def _get_remote_info(self, path: str) -> Dict[str, Any]:
        """Get remote repository information."""
        try:
            result = subprocess.run(
                ["git", "remote", "get-url", "origin"],
                cwd=path, capture_output=True, text=True
            )
            # `git remote get-url` exits non-zero when no remote is configured,
            # so check the return code instead of assuming success.
            if result.returncode != 0:
                return {"has_remote": False}
            return {
                "origin_url": result.stdout.strip(),
                "has_remote": True
            }
        except (OSError, subprocess.SubprocessError):
            return {"has_remote": False}


# Required: Command instance for the registry
command = GitCommand()
📋 Configuration File (config.json)
{
  "name": "mycommand",
  "description": "Custom context provider for specialized analysis",
  "version": "1.0.0",
  "author": "Your Name",
  "category": "analysis",
  "usage_examples": [
    {
      "command": "mycommand: sample input text",
      "description": "Analyzes the provided text input"
    },
    {
      "command": "mycommand: /path/to/file",
      "description": "Analyzes content from a file path"
    }
  ],
  "requirements": [
    "python>=3.8"
  ],
  "optional_dependencies": [
    "numpy",
    "requests"
  ],
  "tags": ["analysis", "custom", "text"]
}
🚀 Using Your Custom Command
After creating your command, restart Bielik CLI to load it:
# Restart Bielik CLI
./run.py
# Your command will now be available:
mycommand: analyze this text
# Check if it's loaded:
:help
📋 Context Provider Command Features
Built-in capabilities:
- ✅ Automatic loading from the commands/ directory
- ✅ Error handling and validation
- ✅ Help generation from docstrings and config
- ✅ Context integration with AI prompts
- ✅ Argument parsing from the name: args format
- ✅ Multiple command support in a single session
What makes a good Context Provider:
- 📊 Structured output – return organized, meaningful data
- 🎯 Specific purpose – focus on one type of analysis
- ⚡ Fast execution – avoid long-running operations
- 📝 Clear documentation – good descriptions and examples
- 🛡️ Error handling – graceful failure with helpful messages
🌍 Real-world Examples
Available Context Providers:
folder: .             # Analyze directory structure and files
calc: 2+3*4           # Mathematical calculations
pdf: document.pdf     # PDF document analysis
git: /path/to/repo    # Git repository information (example above)
Example workflow:
👤 You: folder: ~/projects
✅ Context loaded! Directory analysis complete.
👤 You: What are the largest Python files in this project?
🤖 AI: [Uses folder context to identify and analyze Python files]
👤 You: git: .
✅ Context loaded! Git repository analysis complete.
👤 You: Summarize recent commits and current status
🤖 AI: [Uses git context to provide commit summaries and status]
🚀 Development
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik
python -m venv .venv
source .venv/bin/activate
pip install -e .
pip install -e .[local]
pip install -e .[dev]
pip install -e .[gpu]
pip install -e .[vision]
🐳 Docker Testing Framework
Bielik CLI includes a comprehensive Docker testing framework for multiplatform installation verification across all major Linux distributions.
Available Docker Tests
# Complete multiplatform test suite (all distributions)
make docker-test
# Individual distribution tests
make docker-test-ubuntu # Ubuntu 22.04
make docker-test-debian # Debian 12
make docker-test-alpine # Alpine Linux 3.19
make docker-test-centos # CentOS Stream 9
make docker-test-arch # Arch Linux
make docker-test-oneliner # One-liner installation simulation
# Docker management
make docker-build # Build all test images
make docker-clean # Clean Docker artifacts
# Complete test suite (Python + Docker)
make test-all # Run both unit tests and Docker tests
What Docker Tests Verify
- ✅ Installation Success: Bielik CLI installs without errors
- ✅ Context Provider Commands: folder:, calc:, pdf: work correctly
- ✅ Cross-platform Compatibility: Works on Ubuntu, Debian, Alpine, CentOS, Arch
- ✅ One-liner Installation: curl | bash installation process
- ✅ Dependency Management: Python packages install correctly
- ✅ Virtual Environment: .venv setup and activation
Docker Test Architecture
docker/
├── test-multiplatform.yml     # Docker Compose configuration
├── Dockerfile.test-ubuntu     # Ubuntu 22.04 test environment
├── Dockerfile.test-debian     # Debian 12 test environment
├── Dockerfile.test-alpine     # Alpine Linux 3.19 test environment
├── Dockerfile.test-centos     # CentOS Stream 9 test environment
├── Dockerfile.test-arch       # Arch Linux test environment
└── Dockerfile.test-oneliner   # One-liner installation test
Each test environment:
- Sets up a clean Linux distribution
- Installs system dependencies (Python, git, build tools)
- Runs python3 install.py --skip-ai to install Bielik with Context Provider Commands
- Verifies the installation with python3 run.py --info
- Tests Context Provider Commands (folder analysis, calculations)
Running Docker Tests in CI/CD
# GitHub Actions example
- name: Run multiplatform Docker tests
run: make docker-test
- name: Run specific distribution test
run: make docker-test-ubuntu
Package Structure
bielik/
├── bielik/
│   ├── __init__.py            # Package initialization
│   ├── cli.py                 # CLI entry point (wrapper)
│   ├── client.py              # Client entry point (wrapper)
│   ├── server.py              # FastAPI web server
│   ├── config.py              # Configuration management
│   ├── hf_models.py           # Hugging Face model management
│   ├── content_processor.py   # Content processing utilities
│   ├── cli/                   # Modular CLI components
│   │   ├── __init__.py
│   │   ├── main.py            # Main CLI entry and argument parsing
│   │   ├── commands.py        # Command processing and execution
│   │   ├── command_api.py     # Context Provider Commands API
│   │   ├── models.py          # HF model management CLI
│   │   ├── setup.py           # Interactive setup manager
│   │   └── send_chat.py       # Chat communication handling
│   └── client/                # Modular client components
│       ├── __init__.py        # Client package exports
│       ├── core.py            # Core BielikClient class
│       ├── model_manager.py   # HF model operations for client
│       └── utils.py           # Client utility functions
├── commands/                  # Context Provider Commands
│   ├── folder/                # Directory analysis command
│   ├── calc/                  # Advanced calculator command
│   └── pdf/                   # Document processing command
├── docker/                    # Docker testing framework
│   ├── test-multiplatform.yml
│   └── Dockerfile.test-*      # Test environments
├── tests/
│   ├── __init__.py
│   ├── test_cli.py            # CLI unit tests (updated for local HF models)
│   └── test_server.py         # Server unit tests
├── install.py                 # Universal Python installer
├── run.py                     # Universal launcher
├── quick-install.sh           # Unix one-liner installer
├── quick-install.bat          # Windows one-liner installer
├── pyproject.toml             # Modern Python packaging
├── setup.cfg                  # Package configuration
├── MANIFEST.in                # Package manifest
├── LICENSE                    # Apache 2.0 license
├── README.md                  # This comprehensive documentation
├── Makefile                   # Development automation with Docker tests
├── todo.md                    # Project specifications
└── .github/workflows/         # CI/CD automation
    └── python-publish.yml
Author
Tom Sapletta - DevOps Engineer & Systems Architect
- 15+ years in DevOps, Software Development, and Systems Architecture
- Founder & CEO at Telemonit (Portigen - edge computing power solutions)
- Based in Germany | Open to remote collaboration
- Passionate about edge computing, hypermodularization, and automated SDLC
License
Apache-2.0 (see the LICENSE file).
Project details
Download files
File details
Details for the file bielik-0.2.1.tar.gz.
File metadata
- Download URL: bielik-0.2.1.tar.gz
- Upload date:
- Size: 105.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d13284e004bb431569bdb259150e3ec1b680512d0fb93ceecfed87a5810d0822 |
| MD5 | 54ab0bdc94be2ded4e292df9446b2666 |
| BLAKE2b-256 | 92e263e65d806d6303376deeb73d79ae0fdd696a38cb75b17fb421e987e6ed23 |
File details
Details for the file bielik-0.2.1-py3-none-any.whl.
File metadata
- Download URL: bielik-0.2.1-py3-none-any.whl
- Upload date:
- Size: 56.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0a2c065f48c9390e2134f821aaaf41dbffec044affc02bd946112b2b96cd91d1 |
| MD5 | ae3d5ed9f2fdf2e02c144e1bd8d42f4a |
| BLAKE2b-256 | 7de688f063ec6d7619f8110df989cf551430544bc435278f6e86e825e0fc7ba5 |