Bielik – Local-first AI assistant with HuggingFace and SpeakLeash models

Project description

🦅 Bielik

Bielik is a groundbreaking Polish language model created by SpeakLeash, a foundation dedicated to the development of Polish artificial intelligence.

🇵🇱 Bielik is available on Hugging Face.

You can now test it directly in the shell, using features that let you build a multi-agent environment in your company 🚀

  • 🎯 HuggingFace Integration - Direct model downloads from HF Hub
  • 💬 Polish Language Optimized - Built for Polish conversation and analysis
  • 🖼️ Vision Capabilities - Image analysis and visual question answering
  • 📝 Document Processing - PDF, DOCX, web content analysis
  • 🐳 Docker Ready - Containerized testing environments
  • ⚡ Lightweight - Minimal (~50MB) or Full (~2GB) installation options

Author: Tom Sapletta
License: Apache-2.0


๐Ÿ—๏ธ Architecture

┌────────────────────────────────────────────────────────────────┐
│                       🦅 BIELIK SYSTEM                         │
├──────────────────────┬──────────────────────┬──────────────────┤
│  🖥️ CLI Shell        │  🌐 FastAPI Server   │  🐳 Docker Tests │
│  • Interactive       │  • REST /chat        │  • Minimal       │
│  • Personalized      │  • WebSocket /ws     │  • Full          │
│  • Multi-modal       │  • Port 8000         │  • CI/CD         │
└──────────────────────┴──────────────────────┴──────────────────┘
            │                      │                      │
            ▼                      ▼                      ▼
┌────────────────────────────────────────────────────────────────┐
│                   🤗 HUGGINGFACE INTEGRATION                   │
│   Direct Downloads ◄──► Local Model Execution                  │
│   • HF Hub API           • Transformers Pipeline               │
│   • Model Management     • Vision Models (optional)            │
└───────────────────────────────┬────────────────────────────────┘
                                ▼
┌────────────────────────────────────────────────────────────────┐
│                   📚 POLISH LANGUAGE MODELS                    │
│   🤖 SpeakLeash/Bielik Models (HuggingFace Hub)                │
│   🔗 Direct: HuggingFace → Local Storage → Execution           │
│   🎯 Polish-optimized conversation and analysis                │
└────────────────────────────────────────────────────────────────┘


🤖 About Bielik Model


🚀 How it Works:

  1. Bielik CLI connects directly to HuggingFace Hub
  2. Models are downloaded from Speakleash organization on HuggingFace
  3. Local execution uses Transformers library (optional for vision)
  4. Chat interface (CLI/Web) → Local Models → Polish responses
  5. Modular design supports text-only or full vision capabilities
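The five steps above can be sketched as a tiny pipeline. This is an illustrative stub, not Bielik's actual API: the function names `download_model` and `run_model` are stand-ins, and no weights are really fetched.

```python
from pathlib import Path

# Illustrative stand-ins for steps 1-5; not Bielik's actual API.
def download_model(repo_id: str, cache_dir: Path) -> Path:
    """Steps 1-2: resolve a HuggingFace repo into local storage (stubbed)."""
    target = cache_dir / repo_id.replace("/", "--")
    target.mkdir(parents=True, exist_ok=True)  # real code would fetch weights here
    return target

def run_model(model_path: Path, prompt: str) -> str:
    """Steps 3-4: local execution producing a Polish response (stubbed)."""
    return f"[{model_path.name}] odpowiedź na: {prompt}"

path = download_model("speakleash/bielik-4.5b-v3.0-instruct", Path("/tmp/bielik-cache"))
print(run_model(path, "Cześć!"))
```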

📌 Features

  • 🎯 HuggingFace Integration – Direct model downloads and management
    • 🔍 Model Discovery – Browse available Polish language models
    • 📦 Smart Downloads – Automatic model caching and versioning
    • 🚀 Auto-Switch – Automatically switches to newly downloaded models
    • 🛠️ Interactive – User-friendly model selection on first startup
  • 🖥️ Enhanced CLI python -m bielik – Personalized chat experience
    • 📋 Commands – :help, :models, :download, :delete, :switch, :settings, :name
    • ⚙️ Personalization – Custom user names, dynamic assistant names
    • 🤗 HF Management – Direct HuggingFace model operations
    • 📁 Content Analysis – Folder scanning, document processing
    • 🌐 Cross-platform – Windows, macOS, Linux support
  • 🖼️ Vision Capabilities (Full version only)
    • 🔍 Image Analysis – Automatic image captioning in Polish
    • ❓ Visual QA – Ask questions about images
    • 🎨 Multi-modal – Combined text and image understanding
    • ⚡ GPU Support – Hardware acceleration for faster processing
  • 🐍 Python API – Programmatic access via the BielikClient class
    • 💬 Chat methods – chat(), query(), conversation management
    • 🔧 Model control – Download, switch, and manage models
    • 🤗 HF Models – Full HuggingFace integration
    • 📤 Export – Conversation history in multiple formats
  • 🌐 Web Server (FastAPI on port 8000):
    • 📡 REST – POST /chat endpoint for JSON communication
    • ⚡ WebSocket – WS /ws for real-time chat
    • 🖼️ Multi-modal – Support for text and image inputs
  • 🐳 Docker Support – Complete containerized testing
    • 📦 Minimal Version – Lightweight text-only container (~50MB)
    • 🎯 Full Version – Complete vision-enabled container (~2GB)
    • 🔧 CI/CD – Automated testing environments

⚡ Performance Optimized Installation

Bielik now features lazy loading and optimized startup for faster performance. We recommend using Conda for the most reliable installation experience.

🚀 Installation Guide

๐Ÿ Conda Installation (Recommended)

Prerequisites

  • Install Miniconda or Anaconda
  • For GPU support: Install appropriate NVIDIA drivers and CUDA toolkit

1. Create and Activate Environment

# Create a new conda environment
conda create -n bielik python=3.11 -y
conda activate bielik

2. Install System Dependencies

conda install -c conda-forge -y \
    cmake \
    make \
    gcc_linux-64 \
    gxx_linux-64 \
    libgcc \
    libstdcxx-ng

3. Install PyTorch (CPU or GPU)

For CPU-only:

conda install -c pytorch -y \
    pytorch \
    torchvision \
    torchaudio \
    cpuonly

For NVIDIA GPU (CUDA 11.8):

conda install -c pytorch -y \
    pytorch \
    torchvision \
    torchaudio \
    pytorch-cuda=11.8 \
    -c nvidia

4. Install Bielik and Dependencies

# Install Hugging Face libraries
conda install -c huggingface -y \
    transformers \
    accelerate \
    sentencepiece \
    huggingface-hub \
    bitsandbytes

# Install llama-cpp-python via conda (recommended - avoids C++ compilation issues)
conda install -c conda-forge llama-cpp-python

# Install Bielik in development mode
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik
conda develop .

5. Verify Installation

python scripts/verify_installation.py

⚡ Optimizing Model Loading

Bielik now includes optimized model loading with a 10-second timeout and debug mode:

# Start Bielik with debug mode (shows detailed loading information)
BIELIK_DEBUG=1 bielik

# Set a custom timeout (in seconds)
BIELIK_LOAD_TIMEOUT=15 bielik

If a model takes longer than the timeout to load, the debugger will automatically activate and show detailed logs to help diagnose the issue.
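The timeout behaviour described above can be sketched with a watchdog thread. Only the `BIELIK_DEBUG` and `BIELIK_LOAD_TIMEOUT` variable names come from the documentation; the loader below is an illustrative stand-in, not Bielik's source.

```python
import os
import threading

# Documented environment variables; the loading logic is illustrative.
DEBUG = os.environ.get("BIELIK_DEBUG") == "1"
TIMEOUT = float(os.environ.get("BIELIK_LOAD_TIMEOUT", "10"))

def load_with_timeout(load_fn, timeout=TIMEOUT):
    """Run load_fn in a worker thread; give up (but keep loading) after `timeout` seconds."""
    result = {}
    worker = threading.Thread(target=lambda: result.setdefault("model", load_fn()),
                              daemon=True)
    worker.start()
    worker.join(timeout)
    if worker.is_alive():
        # Deadline exceeded: surface diagnostics instead of blocking the CLI.
        print(f"Model load exceeded {timeout:.0f}s - enabling detailed logging")
        return None
    return result.get("model")

print(load_with_timeout(lambda: "fake-model", timeout=1.0))
```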

🖥️ Platform-Specific Notes

Windows

# Use Anaconda Prompt with Administrator privileges
# Install Visual Studio Build Tools with C++ workload

macOS (Intel/Apple Silicon)

# For Apple Silicon with Metal acceleration:
conda install -c conda-forge llama-cpp-python

# Or use pip with Metal flags (if conda not available):
# CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 \
# pip install llama-cpp-python --no-cache-dir

Linux

# Install system dependencies
sudo apt-get update && sudo apt-get install -y \
    build-essential \
    libssl-dev \
    zlib1g-dev \
    libbz2-dev \
    libreadline-dev \
    libsqlite3-dev \
    wget \
    llvm \
    libncurses5-dev \
    libncursesw5-dev \
    xz-utils \
    tk-dev \
    libffi-dev \
    liblzma-dev \
    python-openssl \
    git

🛠️ Troubleshooting

Model Loading Issues

If models are loading slowly or timing out:

  1. Check your internet connection
  2. Verify you have enough disk space (~10GB free recommended)
  3. Enable debug mode: BIELIK_DEBUG=1 bielik
  4. Try clearing the model cache: rm -rf ~/.cache/huggingface/hub

GPU Acceleration

To enable GPU acceleration, ensure you have:

  1. Latest NVIDIA drivers installed
  2. CUDA toolkit matching your PyTorch version
  3. Properly configured LD_LIBRARY_PATH

Common Errors

  • CUDA out of memory: Reduce batch size or use a smaller model
  • Missing libraries: Install required system dependencies
  • Permission errors: Run with appropriate permissions or use --user flag

🛠 Manual Installation

# 1. Clone repository
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik

# 2. Run universal installer
python install.py

🔧 Troubleshooting

Common Issues

1. Conda Environment Issues

Problem: Conda command not found
Solution:

# Add Conda to PATH
export PATH="$HOME/miniconda3/bin:$PATH"
# Initialize Conda for your shell
eval "$(conda shell.bash hook)"

2. CUDA/GPU Support

Problem: CUDA not detected
Solution: Install CUDA toolkit and configure Conda:

conda install -c nvidia cuda-toolkit
conda install -c conda-forge cudatoolkit-dev

3. Slow Model Loading

Problem: First load is slow
Solution: This is normal for the first load. Subsequent loads will be faster thanks to lazy loading.

4. Missing Dependencies

Problem: Missing system libraries
Solution: Install required system packages:

# Ubuntu/Debian
sudo apt-get update && sudo apt-get install -y \
    build-essential \
    cmake \
    libopenblas-dev \
    liblapack-dev \
    libjpeg-dev

# CentOS/RHEL
sudo yum groupinstall -y "Development Tools"
sudo yum install -y cmake openblas-devel lapack-devel libjpeg-turbo-devel

Verifying Your Installation

Run the verification script to check for common issues:

python scripts/verify_installation.py

Getting Help

If you encounter any issues, please:

  1. Check the GitHub Issues for known problems
  2. Run the verification script and include its output when reporting issues
  3. Provide details about your system and the exact error message

🚀 Performance Tips

  1. First Run: The first run will be slower as models are loaded and cached
  2. Subsequent Runs: Enjoy faster startup times thanks to lazy loading
  3. Memory Usage: Close other memory-intensive applications when running Bielik
  4. GPU Acceleration: For best performance, use a system with a CUDA-compatible GPU

🪟 Windows Users

REM Double-click install.bat or run:
install.bat

REM Launch Bielik:
run.bat

๐Ÿง Linux/macOS Users

# Run installer:
./install.sh

# Launch Bielik:
./run.sh

⚙️ Installation Options

# Basic installation (recommended)
python install.py

# Skip AI models (fastest installation)
python install.py --skip-ai

# Use conda/mamba instead of pip
python install.py --conda

# Development installation
python install.py --dev

# Show all options
python install.py --help

🎯 What the Universal Installer Does:

  • ✅ Auto-detects your operating system and Python version
  • ✅ Creates an isolated virtual environment
  • ✅ Installs all dependencies with smart fallback strategies
  • ✅ Attempts multiple llama-cpp-python installation methods
  • ✅ Creates platform-specific launcher scripts
  • ✅ Works even if AI models fail to install (Context Provider Commands still work)
  • ✅ Provides clear next steps and troubleshooting

🚀 After Installation

# Universal launcher (any platform)
python run.py

# Platform-specific launchers
./run.sh        # Linux/macOS
run.bat         # Windows

# Or direct activation
.venv/bin/python -m bielik.cli.main     # Linux/macOS
.venv\Scripts\python -m bielik.cli.main # Windows

📦 Alternative: PyPI Installation

For users who prefer traditional pip installation:

🪶 Minimal Version (Text-only)

# Install minimal version (text-only)
pip install bielik

# Start CLI and download your first model
python -m bielik

What's included:

  • ✅ Polish conversation and text analysis
  • ✅ HuggingFace model downloads and management
  • ✅ Document processing (PDF, DOCX, TXT)
  • ✅ Web content analysis
  • ✅ Context Provider Commands (folder:, calc:, pdf:)
  • ✅ Personalized CLI experience
  • ❌ Local AI models (requires manual llama-cpp-python installation)

🎯 Full Version (With vision)

# Install full version (text + vision)
pip install bielik[vision]

# Start CLI with image analysis support
python -m bielik

What's included:

  • ✅ Everything from the minimal version
  • ✅ Image analysis and captioning
  • ✅ Visual question answering
  • ✅ GPU acceleration support
  • ✅ Multi-modal document processing

🔄 Upgrade Options

# Upgrade minimal → full
pip install bielik[vision]

# Or install specific optional features
pip install bielik[local]    # Local model execution
pip install bielik[gpu]      # GPU acceleration
pip install bielik[dev]      # Development tools
pip install bielik[vision]   # Vision capabilities

🚀 Quick Start Guide

🎯 Instant Start (No setup required!)

Bielik now works without any external dependencies. Just install and start chatting:

# Install minimal version
pip install bielik

# Start CLI and choose your first model
python -m bielik

What happens on first run:

  • ๐Ÿ” Model Selection โ€” Choose from available Polish models
  • ๐Ÿ“ฅ Auto Download โ€” Selected model downloads from HuggingFace
  • ๐Ÿ”„ Auto Switch โ€” Automatically switches to the new model
  • ๐Ÿ’ฌ Ready to Chat โ€” Start conversing in Polish immediately!

📱 Choose Your Experience

🪶 Minimal Setup (Recommended)

Perfect for text conversations and document analysis:

# 1. Install
pip install bielik

# 2. Start and download your first model
python -m bielik
:download speakleash/bielik-4.5b-v3.0-instruct

# 3. Start chatting in Polish!
Cześć! Jak mogę Ci pomóc?

🎯 Full Setup (For image analysis)

Includes vision capabilities for image analysis:

# 1. Install with vision support
pip install bielik[vision]

# 2. Start and download models
python -m bielik
:download speakleash/bielik-4.5b-v3.0-instruct

# 3. Analyze images
Przeanalizuj to zdjęcie: image.jpg

๐Ÿณ Docker Setup (For testing)

Use Docker for isolated testing environments:

# Clone repository
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik

# Test minimal version
docker-compose --profile minimal up bielik-minimal

# Test full version  
docker-compose --profile full up bielik-full

🎯 Context Provider Commands

Bielik CLI features a standardized Context Provider Command system that allows you to generate structured data for AI analysis. All external commands use the format name: (with colon after the command name).

📋 Available Commands

Command             Format              Description                                Example
📁 Folder Analysis  folder: <path>      Directory scanning and file analysis       folder: ~/Documents
🧮 Calculator       calc: <expression>  Mathematical calculations and evaluations  calc: 2 + 3 * 4
📄 Document Reader  pdf: <file>         Text extraction from PDF, DOCX, TXT files  pdf: report.pdf

💡 How Context Provider Commands Work

  1. Generate Context: Commands analyze input and create structured data
  2. AI Integration: Data is formatted for AI consumption and analysis
  3. Independent Operation: Commands work without AI models installed
  4. Project Integration: Artifacts are automatically saved to your active project
  5. Standardized Format: All external commands use name: format for consistency
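The standardized `name: args` format is straightforward to parse. The sketch below illustrates the dispatch convention; the regex and function are illustrative, not Bielik's actual implementation.

```python
import re

# Parse the standardized `name: args` command format (illustrative).
COMMAND_RE = re.compile(r"^(?P<name>\w+):\s*(?P<args>.*)$")

def parse_context_command(line: str):
    """Return (command, args) for `name: args` input, or None for plain chat."""
    match = COMMAND_RE.match(line.strip())
    return (match.group("name"), match.group("args")) if match else None

print(parse_context_command("calc: 2 + 3 * 4"))     # ('calc', '2 + 3 * 4')
print(parse_context_command("folder: ~/Documents")) # ('folder', '~/Documents')
print(parse_context_command("plain chat message"))  # None
```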

🚀 Usage Examples

# Directory Analysis
bielik -p "folder: ."
bielik -p "folder: ~/Documents --recursive"

# Mathematical Calculations
bielik -p "calc: 2 + 3 * 4"
bielik -p "calc: sqrt(16) + sin(pi/2)"
bielik -p "calc: factorial(5)"

# Document Processing
bielik -p "pdf: document.pdf"
bielik -p "pdf: report.docx --metadata"
bielik -p "pdf: file.pdf --pages=1-5"
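A `calc:`-style evaluator can be as small as a restricted `eval` over `math` functions. This sketch mirrors the examples above but is illustrative only; the real command's supported function set may differ.

```python
import math

# Restricted evaluator for calc:-style expressions (illustrative).
ALLOWED = {name: getattr(math, name)
           for name in ("sqrt", "sin", "cos", "pi", "factorial")}

def calc(expression: str):
    # Empty __builtins__ blocks access to anything outside ALLOWED.
    return eval(expression, {"__builtins__": {}}, ALLOWED)

print(calc("2 + 3 * 4"))             # 14
print(calc("sqrt(16) + sin(pi/2)"))  # 5.0
print(calc("factorial(5)"))          # 120
```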

⚡ Interactive Usage

In interactive mode, Context Provider Commands load data for AI analysis:

python -m bielik

# Load directory context
> folder: ~/projects

✅ Context loaded! You can now ask questions about the provided data.

# Ask AI about the loaded context
> What files are in this directory?
> Which file is the largest?
> Are there any Python files?

🚀 Project Management System

Bielik features a comprehensive session-based project management system that organizes your analysis artifacts into beautiful, shareable HTML projects.

🎯 Key Features

  • 🗂️ Multi-Project Sessions - Work on multiple projects simultaneously
  • 📄 HTML Artifact Generation - Beautiful, interactive project representations
  • 🔄 Automatic Artifact Collection - Context Provider Commands auto-save to the active project
  • 🌐 Browser-Friendly Viewing - Projects open in your default browser
  • 📊 Rich Metadata - Embedded project information and artifact tracking
  • ✅ Validation System - Comprehensive integrity checking

๐Ÿ“ Project Management Commands

Command                                      Description           Example
:project create <name>                       Create new project    :project create "Data Analysis"
:project create <name> <desc> --tags <tags>  Create with metadata  :project create "ML Project" "Machine learning analysis" --tags ml,data
:project switch <id|name>                    Switch to project     :project switch data-analysis
:project list                                List all projects     :project list
:project info [id]                           Show project details  :project info
:project open [id]                           Open in browser       :project open
:project validate [id]                       Validate project      :project validate

🎨 Project Workflow

# 1. Create a new project
python -m bielik
> :project create "Website Analysis" "Analyzing project structure"

✅ Project Created Successfully
🎯 Project is now active. Use Context Provider Commands to add artifacts.

# 2. Add artifacts using Context Provider Commands
> folder: ~/my-website
🎯 Artifact Added to Project: Website Analysis

> calc: file_count * average_size / 1024
🎯 Artifact Added to Project: Website Analysis

> pdf: documentation.pdf
🎯 Artifact Added to Project: Website Analysis

# 3. View your project
> :project open
🌐 Project Opened in Browser

# 4. Switch between projects
> :project list
📋 Bielik Projects (Current Session)

🎯 **ACTIVE** Website Analysis
   🆔 ID: abc123ef...
   📊 Artifacts: 3
   📅 Created: 2024-12-20 14:30

📁 Data Analysis
   🆔 ID: def456gh...
   📊 Artifacts: 5
   📅 Created: 2024-12-20 12:15

> :project switch def456gh
✅ Switched to Project: Data Analysis

๐Ÿ—๏ธ HTML Project Structure

Each project generates a beautiful HTML file with:

  • 📊 Interactive Dashboard - Project overview with metadata
  • 🎨 Beautiful Design - Modern, responsive interface
  • 📑 Artifact Viewer - Organized display of all artifacts
  • 🔍 Rich Metadata - Embedded in HTML attributes for programmatic access
  • 🌐 Offline Capability - Works without an internet connection

Example Project Structure:

./bielik_projects/
├── abc123ef-ghij-klmn-opqr-stuvwxyz0123/
│   ├── index.html          # Interactive project page
│   └── metadata.json       # Project metadata
└── def456gh-ijkl-mnop-qrst-uvwxyz012345/
    ├── index.html
    └── metadata.json
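Because each project directory carries a `metadata.json`, projects can be inspected programmatically. The field names in this sketch are illustrative stand-ins, not Bielik's actual schema.

```python
import json
import tempfile
from pathlib import Path

# Read a project's metadata.json from a ./bielik_projects/<id>/ directory.
def load_project_metadata(project_dir: Path) -> dict:
    return json.loads((project_dir / "metadata.json").read_text(encoding="utf-8"))

# Demo against a temporary stand-in project directory (illustrative fields).
with tempfile.TemporaryDirectory() as tmp:
    project = Path(tmp)
    (project / "metadata.json").write_text(
        json.dumps({"name": "Website Analysis", "artifacts": 3}), encoding="utf-8")
    print(load_project_metadata(project)["name"])
```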

๐Ÿ” Validation System

Bielik includes comprehensive validators for:

  • ๐Ÿ  HTML Artifacts - Structure, metadata, and compliance checking
  • โš™๏ธ Environment Files - .env configuration validation
  • ๐Ÿ“œ Command Scripts - Code quality and compliance verification
# Validate current project
> :project validate
๐Ÿ” Project Validation Results
๐Ÿ“Š Status: โœ… VALID (0 errors, 0 warnings)

# Validate specific project
> :project validate abc123ef

💻 Usage

🖥️ CLI Commands & Options

Starting Bielik

# Basic usage
python -m bielik                         # Start interactive chat
bielik                                   # Alternative (if in PATH)

# Advanced options  
python -m bielik --help                  # Show all options

Interactive Commands (inside chat session)

📋 Model Management:

:models                    # List available HuggingFace models
:download <model-name>     # Download model from HuggingFace
:switch <model-name>       # Switch to downloaded model
:delete <model-name>       # Delete model from local storage

⚙️ Personalization:

:name <your-name>          # Set your display name
:settings                  # Show current configuration

🛠️ Utilities:

:help                      # Show all commands
:clear                     # Clear conversation history
:exit                      # Quit (or Ctrl+C)

Usage Examples

First Time Setup:

$ python -m bielik
# Choose from recommended models:
# 1. speakleash/bielik-4.5b-v3.0-instruct (Recommended)
# 2. speakleash/bielik-7b-instruct-v0.1
# Enter choice: 1

🔄 Downloading speakleash/bielik-4.5b-v3.0-instruct...
✅ Model ready! Switching to bielik-4.5b-v3.0-instruct

👤 You: Cześć! Jak się masz?
🤖 bielik-4.5b: Cześć! Mam się dobrze, dziękuję...

Everyday Usage:

👤 You: :name Jan
✅ Display name set to: Jan

👤 Jan: Przeanalizuj folder ~/dokumenty
🤖 bielik-4.5b: [Analyzes folder structure and contents]

👤 Jan: Opisz to zdjęcie: vacation.jpg  # (Full version only)
🤖 bielik-4.5b: [Describes image in Polish]

๐Ÿ Python API

Use Bielik programmatically in your Python applications:

from bielik.client import BielikClient

# Create client (auto-downloads model if needed)
client = BielikClient()

# Send a Polish message
response = client.chat("Napisz krótki wiersz o Polsce")
print(response)

# Get model status
status = client.get_status()
print(f"Current model: {status['current_model']}")
print(f"Models available: {status['models_available']}")

# Export conversation
history = client.export_conversation(format="markdown")

Quick functions:

from bielik.client import quick_chat

# One-off Polish query
response = quick_chat("Co to jest sztuczna inteligencja?")
print(response)

BielikClient Options:

  • model: Specific HuggingFace model to use
  • auto_download: Auto-download model if missing (default: True)

๐ŸŒ Web Server

# Start web server
uvicorn bielik.server:app --port 8000

# Or with Docker
docker-compose --profile minimal up bielik-minimal

Endpoints:

  • POST /chat - JSON chat endpoint
  • WS /ws - WebSocket real-time chat
  • GET /models - List available models

Example request:

{"messages": [{"role":"user","content":"Cześć! Jak się masz?"}]}
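A client only needs to POST that JSON shape to the endpoint. The sketch below builds the request body; the actual HTTP call is left as a comment so the snippet runs without a live server.

```python
import json

# Build the JSON body for POST /chat (shape taken from the example above).
def build_chat_request(user_text: str) -> str:
    return json.dumps({"messages": [{"role": "user", "content": user_text}]},
                      ensure_ascii=False)

body = build_chat_request("Cześć! Jak się masz?")
print(body)
# To send against a running server (e.g. with the requests library):
# requests.post("http://localhost:8000/chat", data=body.encode("utf-8"),
#               headers={"Content-Type": "application/json"})
```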

๐Ÿณ Docker Usage

Quick Testing:

# Test minimal version
docker run -it bielik:minimal

# Test full version with GPU
docker run --gpus all -it bielik:full

With persistent storage:

# Minimal with model persistence
docker run -it -v $(pwd)/models:/app/models bielik:minimal

# Full with models and images
docker run -it \
  -v $(pwd)/models:/app/models \
  -v $(pwd)/images:/app/images \
  bielik:full

Development setup:

# Clone and test
git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik

# Run automated tests
docker-compose --profile test up test-runner

# Interactive development
docker-compose --profile minimal run --rm bielik-minimal bash

🔧 Environment Variables

Core Settings:

  • BIELIK_CLI_USERNAME – Your display name in the CLI (auto-detected from the system)
  • BIELIK_CLI_CURRENT_MODEL – Currently selected model
  • BIELIK_CLI_ASSISTANT_NAME – Assistant display name (auto-set from the model)
  • BIELIK_CLI_AUTO_SWITCH – Auto-switch to newly downloaded models (default: true)

Storage & Cache:

  • BIELIK_MODELS_DIR – Local model storage directory
  • BIELIK_DATA_DIR – User data and settings directory
  • HF_HOME – HuggingFace cache directory

Docker Environment:

  • BIELIK_MODE – minimal or full (Docker only)
  • BIELIK_IMAGES_DIR – Images directory for analysis (full version)

🛠️ Troubleshooting

Installation Issues

Problem: Minimal version works but vision features don't

# Upgrade to full version
pip install bielik[vision]

# Verify vision packages
python -c "import PIL, transformers; print('Vision packages OK')"

Problem: Model download fails or times out

# Check HuggingFace connectivity
python -c "from huggingface_hub import HfApi; print('HF connection OK')"

# Check available disk space (models are 2-8GB)
df -h

# Manual download with timeout
:download speakleash/bielik-4.5b-v3.0-instruct

Problem: "No models available" on first startup

# Download a model manually
python -m bielik
:models
:download speakleash/bielik-4.5b-v3.0-instruct

Runtime Issues

Problem: Model responses are slow or use too much memory

# Use smaller model
:switch speakleash/bielik-4.5b-v3.0-instruct  # instead of 7b

# Check system resources
htop      # Linux/macOS
taskmgr   # Windows

# Enable GPU acceleration (full version)
pip install bielik[gpu]

Problem: Image analysis not working

# Check if vision packages installed
python -c "from bielik.image_analyzer import ImageAnalyzer; ia = ImageAnalyzer(); print(f'Available: {ia.is_available()}')"

# Install vision support
pip install bielik[vision]

Problem: CLI settings not persisting

# Check .env file creation
ls -la ~/.bielik/ || ls -la ./

# Reset settings
:name YourName
:settings

Docker Issues

Problem: Docker containers fail to start

# Build images manually
docker build -f docker/Dockerfile.minimal -t bielik:minimal .
docker build -f docker/Dockerfile.full -t bielik:full .

# Check container logs
docker logs bielik-minimal

Problem: Models not persisting between Docker runs

# Use volume mounts
docker run -v $(pwd)/models:/app/models bielik:minimal

# Or use Docker Compose
docker-compose --profile minimal up

Getting Help

  • GitHub Issues: Report bugs and feature requests
  • Command Help: python -m bielik --help or :help in CLI
  • Test Environment: Use Docker for isolated testing
  • Check Status: Use :settings command for current configuration

🛠️ Creating Custom Context Provider Commands

Bielik CLI supports extensible Context Provider Commands that allow you to create custom name: commands that generate structured context data for AI analysis.

📋 Quick Setup Guide

1. Create command directory:

mkdir -p commands/mycommand
cd commands/mycommand

2. Create your command files:

  • main.py - Your command implementation
  • config.json - Command configuration and metadata

๐Ÿ Python Code Examples

Basic Context Provider Command Template

📄 Configuration File (config.json)

{
  "name": "mycommand",
  "description": "Custom context provider for specialized analysis",
  "version": "1.0.0",
  "author": "Your Name",
  "category": "analysis",
  "usage_examples": [
    {
      "command": "mycommand: sample input text",
      "description": "Analyzes the provided text input"
    },
    {
      "command": "mycommand: /path/to/file",
      "description": "Analyzes content from a file path"
    }
  ],
  "requirements": [
    "python>=3.8"
  ],
  "optional_dependencies": [
    "numpy",
    "requests"
  ],
  "tags": ["analysis", "custom", "text"]
}
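Alongside `config.json`, the command needs a `main.py`. A minimal sketch is shown below; the entry-point name `provide_context` and its signature are assumptions for illustration, so check Bielik's command API for the loader's actual contract.

```python
# commands/mycommand/main.py - illustrative sketch only; the entry-point
# name and signature expected by Bielik's command loader may differ.

def provide_context(args: str) -> dict:
    """Turn the text after `mycommand:` into structured data for AI analysis."""
    text = args.strip()
    return {
        "command": "mycommand",
        "input": text,
        "summary": {
            "characters": len(text),
            "words": len(text.split()),
        },
    }

if __name__ == "__main__":
    print(provide_context("sample input text"))
```

Returning a plain dict keeps the output easy to format into an AI prompt and easy to validate, which matches the "structured output" guidance later in this section.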

🚀 Using Your Custom Command

After creating your command, restart Bielik CLI to load it:

# Restart Bielik CLI
./run.py

# Your command will now be available:
mycommand: analyze this text

# Check if it's loaded:
:help

๐Ÿ” Context Provider Command Features

Built-in capabilities:

  • ✅ Automatic loading from the commands/ directory
  • ✅ Error handling and validation
  • ✅ Help generation from docstrings and config
  • ✅ Context integration with AI prompts
  • ✅ Argument parsing from the name: args format
  • ✅ Multiple command support in a single session

What makes a good Context Provider:

  • 📊 Structured output - Return organized, meaningful data
  • 🎯 Specific purpose - Focus on one type of analysis
  • ⚡ Fast execution - Avoid long-running operations
  • 📝 Clear documentation - Good descriptions and examples
  • 🛡️ Error handling - Graceful failure with helpful messages

🌟 Real-world Examples

Available Context Providers:

folder: .              # Analyze directory structure and files
calc: 2+3*4           # Mathematical calculations  
pdf: document.pdf     # PDF document analysis
git: /path/to/repo    # Git repository information (custom command)

Example workflow:

👤 You: folder: ~/projects
✅ Context loaded! Directory analysis complete.

👤 You: What are the largest Python files in this project?
🤖 AI: [Uses folder context to identify and analyze Python files]

👤 You: git: .
✅ Context loaded! Git repository analysis complete.

👤 You: Summarize recent commits and current status
🤖 AI: [Uses git context to provide commit summaries and status]

๐Ÿ“ Development

git clone https://github.com/tom-sapletta-com/bielik.git
cd bielik
python -m venv .venv
source .venv/bin/activate
pip install -e .
pip install -e .[local]
pip install -e .[dev]
pip install -e .[gpu]
pip install -e .[vision]

๐Ÿณ Docker Testing Framework

Bielik CLI includes a comprehensive Docker testing framework for multiplatform installation verification across all major Linux distributions.

Available Docker Tests

# Complete multiplatform test suite (all distributions)
make docker-test

# Individual distribution tests
make docker-test-ubuntu    # Ubuntu 22.04
make docker-test-debian    # Debian 12
make docker-test-alpine    # Alpine Linux 3.19
make docker-test-centos    # CentOS Stream 9
make docker-test-arch      # Arch Linux
make docker-test-oneliner  # One-liner installation simulation

# Docker management
make docker-build          # Build all test images
make docker-clean          # Clean Docker artifacts

# Complete test suite (Python + Docker)
make test-all              # Run both unit tests and Docker tests

What Docker Tests Verify

✅ Installation Success: Bielik CLI installs without errors
✅ Context Provider Commands: folder:, calc:, pdf: work correctly
✅ Cross-platform Compatibility: Works on Ubuntu, Debian, Alpine, CentOS, Arch
✅ One-liner Installation: curl | bash installation process
✅ Dependency Management: Python packages install correctly
✅ Virtual Environment: .venv setup and activation

Docker Test Architecture

docker/
โ”œโ”€โ”€ test-multiplatform.yml     # Docker Compose configuration
โ”œโ”€โ”€ Dockerfile.test-ubuntu     # Ubuntu 22.04 test environment
โ”œโ”€โ”€ Dockerfile.test-debian     # Debian 12 test environment  
โ”œโ”€โ”€ Dockerfile.test-alpine     # Alpine Linux 3.19 test environment
โ”œโ”€โ”€ Dockerfile.test-centos     # CentOS Stream 9 test environment
โ”œโ”€โ”€ Dockerfile.test-arch       # Arch Linux test environment
โ””โ”€โ”€ Dockerfile.test-oneliner   # One-liner installation test

Each test environment:

  1. Sets up clean Linux distribution
  2. Installs system dependencies (Python, git, build tools)
  3. Runs python3 install.py --skip-ai to install Bielik with Context Provider Commands
  4. Verifies installation with python3 run.py --info
  5. Tests Context Provider Commands (folder analysis, calculations)
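Steps 3-5 above amount to running commands and asserting a zero exit code. A minimal Python sketch of that verification loop follows; the commands in the comments come from the list above, but the helper itself is an assumption, not part of the repository:

```python
import subprocess
import sys

def smoke_check(cmd: list) -> bool:
    """Return True if `cmd` exits with status 0 within 120 s; mirrors the
    per-distribution verification step the Docker tests perform."""
    try:
        return subprocess.run(cmd, capture_output=True,
                              timeout=120).returncode == 0
    except (OSError, subprocess.TimeoutExpired):
        return False

# Analogous to the steps above (run from the repository root):
# smoke_check([sys.executable, "install.py", "--skip-ai"])
# smoke_check([sys.executable, "run.py", "--info"])
```

Capturing output keeps CI logs clean on success, while a non-zero exit, a missing interpreter, or a hang past the timeout all report as a failed check.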

Running Docker Tests in CI/CD

# GitHub Actions example
- name: Run multiplatform Docker tests
  run: make docker-test

- name: Run specific distribution test
  run: make docker-test-ubuntu

๐Ÿ“‚ Package Structure

bielik/
โ”œโ”€โ”€ bielik/
โ”‚   โ”œโ”€โ”€ __init__.py          # Package initialization
โ”‚   โ”œโ”€โ”€ cli.py               # CLI entry point (wrapper)
โ”‚   โ”œโ”€โ”€ client.py            # Client entry point (wrapper)
โ”‚   โ”œโ”€โ”€ server.py            # FastAPI web server
โ”‚   โ”œโ”€โ”€ config.py            # Configuration management
โ”‚   โ”œโ”€โ”€ hf_models.py         # Hugging Face model management
โ”‚   โ”œโ”€โ”€ content_processor.py # Content processing utilities
โ”‚   โ”œโ”€โ”€ cli/                 # Modular CLI components
โ”‚   โ”‚   โ”œโ”€โ”€ __init__.py
โ”‚   โ”‚   โ”œโ”€โ”€ main.py          # Main CLI entry and argument parsing
โ”‚   โ”‚   โ”œโ”€โ”€ commands.py      # Command processing and execution
โ”‚   โ”‚   โ”œโ”€โ”€ command_api.py   # Context Provider Commands API
โ”‚   โ”‚   โ”œโ”€โ”€ models.py        # HF model management CLI
โ”‚   โ”‚   โ”œโ”€โ”€ setup.py         # Interactive setup manager
โ”‚   โ”‚   โ””โ”€โ”€ send_chat.py     # Chat communication handling
โ”‚   โ””โ”€โ”€ client/              # Modular client components
โ”‚       โ”œโ”€โ”€ __init__.py      # Client package exports
โ”‚       โ”œโ”€โ”€ core.py          # Core BielikClient class
โ”‚       โ”œโ”€โ”€ model_manager.py # HF model operations for client
โ”‚       โ””โ”€โ”€ utils.py         # Client utility functions
โ”œโ”€โ”€ commands/                # Context Provider Commands
โ”‚   โ”œโ”€โ”€ folder/             # Directory analysis command
โ”‚   โ”œโ”€โ”€ calc/               # Advanced calculator command
โ”‚   โ””โ”€โ”€ pdf/                # Document processing command
โ”œโ”€โ”€ docker/                 # Docker testing framework
โ”‚   โ”œโ”€โ”€ test-multiplatform.yml
โ”‚   โ””โ”€โ”€ Dockerfile.test-*   # Test environments
โ”œโ”€โ”€ tests/
โ”‚   โ”œโ”€โ”€ __init__.py
โ”‚   โ”œโ”€โ”€ test_cli.py          # CLI unit tests (updated for local HF models)
โ”‚   โ””โ”€โ”€ test_server.py       # Server unit tests
โ”œโ”€โ”€ install.py              # Universal Python installer
โ”œโ”€โ”€ run.py                  # Universal launcher
โ”œโ”€โ”€ quick-install.sh        # Unix one-liner installer
โ”œโ”€โ”€ quick-install.bat       # Windows one-liner installer
โ”œโ”€โ”€ pyproject.toml          # Modern Python packaging
โ”œโ”€โ”€ setup.cfg               # Package configuration
โ”œโ”€โ”€ MANIFEST.in             # Package manifest
โ”œโ”€โ”€ LICENSE                 # Apache 2.0 license
โ”œโ”€โ”€ README.md               # This comprehensive documentation
โ”œโ”€โ”€ Makefile                # Development automation with Docker tests
โ”œโ”€โ”€ todo.md                 # Project specifications
โ””โ”€โ”€ .github/workflows/      # CI/CD automation
    โ””โ”€โ”€ python-publish.yml

Author

Tom Sapletta โ€” DevOps Engineer & Systems Architect

๐Ÿ’ป 15+ years in DevOps, Software Development, and Systems Architecture
๐Ÿข Founder & CEO at Telemonit (Portigen - edge computing power solutions)
๐ŸŒ Based in Germany | Open to remote collaboration
๐Ÿ“š Passionate about edge computing, hypermodularization, and automated SDLC

๐Ÿ“œ License

Apache License 2.0

Download files

Download the file for your platform.

Source Distribution

bielik-0.2.7.tar.gz (105.0 kB)


Built Distribution


bielik-0.2.7-py3-none-any.whl (52.3 kB)


File details

Details for the file bielik-0.2.7.tar.gz.

File metadata

  • Download URL: bielik-0.2.7.tar.gz
  • Size: 105.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for bielik-0.2.7.tar.gz

  • SHA256: caee5cb06e9696d50d6a38840d1b88e31e90bd27b89023a3f365d374e79cb88a
  • MD5: f6a1ea503d4d853de2b94ff86c5be2b4
  • BLAKE2b-256: 4460b33751067bc7fa3caff26b1eeceaf0b168cce3a6143e42847d99805fd950


File details

Details for the file bielik-0.2.7-py3-none-any.whl.

File metadata

  • Download URL: bielik-0.2.7-py3-none-any.whl
  • Size: 52.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for bielik-0.2.7-py3-none-any.whl

  • SHA256: 82527f98be41e8544d6513619fc802e2f61c8cdf84cf265eae70390eb6ec4194
  • MD5: 12286a340a0b4af39dd75328a22c9aa5
  • BLAKE2b-256: 2e71ea4ebaee4c613c8f9aa01cba4ebd9046c96ab0aa44df9de53feb61331306

