# prollama

Intelligent LLM Execution Layer for developer teams.

Anonymize code → route to the cheapest capable model → solve tickets autonomously.
## AI Cost Tracking

This project uses AI-generated code. Total cost: $0.45 across 3 AI commits. Generated on 2026-04-01 using `openrouter/qwen/qwen3-coder-next`.
## Why prollama?
AI coding tools send your code — secrets, business logic, customer names — to cloud LLMs. prollama sits between your tools and the LLM, stripping sensitive data before it leaves your machine.
Three layers of anonymization:
- Regex — API keys, tokens, connection strings, emails, IPs (70+ patterns, <1ms)
- NLP — Person names in comments, addresses, SSNs (heuristic or Presidio, ~5ms)
- AST — Class/function/variable names via tree-sitter (business logic hidden, ~20ms)
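The regex layer is the simplest to picture: scan the source for known secret shapes, swap each match for a numbered placeholder, and remember the mapping. A minimal sketch, assuming an illustrative two-pattern set and placeholder format (the real layer ships 70+ patterns):

```python
import re

# Two illustrative patterns; prollama's actual pattern set is much larger.
PATTERNS = {
    "SECRET": re.compile(r"sk_live_[A-Za-z0-9]+"),
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
}

def regex_anonymize(code: str) -> tuple[str, dict[str, str]]:
    """Replace each match with a numbered placeholder and record the mapping."""
    mapping: dict[str, str] = {}
    for label, pattern in PATTERNS.items():
        for n, match in enumerate(pattern.findall(code), start=1):
            placeholder = f"[{label}_{n:03d}]"
            mapping[placeholder] = match
            code = code.replace(match, placeholder)
    return code, mapping

masked, mapping = regex_anonymize('key = "sk_live_4eC39HqL"  # ops@example.com')
print(masked)   # key = "[SECRET_001]"  # [EMAIL_001]
print(mapping)  # {'[SECRET_001]': 'sk_live_4eC39HqL', '[EMAIL_001]': 'ops@example.com'}
```

Because the pass only does plain string substitution, it stays fast enough to run on every request.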
```python
# BEFORE (what you write)
class StripePaymentProcessor:
    def charge_customer(self, amount):
        key = "sk_live_4eC39HqL..."
        return self.stripe.charge(amount)
```

```python
# AFTER (what the LLM sees)
class Class_001:
    def var_001(self, var_002):
        key = "[SECRET_001]"
        return self.var_003.var_004(var_002)
```
After the LLM responds, prollama reverses the mapping and returns code with your original names.
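That reversal is an inversion of the mapping recorded during anonymization. A hedged sketch of the idea (function name and mapping values are illustrative, not prollama's internals):

```python
MAPPING = {  # produced during anonymization (illustrative values)
    "Class_001": "StripePaymentProcessor",
    "var_001": "charge_customer",
    "var_002": "amount",
}

def deanonymize(llm_output: str, mapping: dict[str, str]) -> str:
    """Swap placeholders in the LLM's response back to the original names."""
    # Replace longer placeholders first so e.g. "var_0010" is never
    # clobbered by a substring replacement of "var_001".
    for placeholder in sorted(mapping, key=len, reverse=True):
        llm_output = llm_output.replace(placeholder, mapping[placeholder])
    return llm_output

fixed = deanonymize("def var_001(self, var_002: float):", MAPPING)
print(fixed)  # def charge_customer(self, amount: float):
```

The mapping never leaves your machine, so the provider only ever sees the placeholder names.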
## Quick Start

```bash
pip install prollama
prollama init                # Create config
prollama solve "Fix the TypeError in auth.py" --file src/auth.py --dry-run
prollama anonymize src/payment.py --level full
prollama shell               # Interactive REPL
```
## Installation

```bash
pip install prollama          # Core CLI + shell
pip install prollama[ast]     # + tree-sitter AST anonymization (recommended)
pip install prollama[proxy]   # + FastAPI proxy server
pip install prollama[nlp]     # + Presidio ML-based PII detection
pip install prollama[all]     # Everything
pip install prollama[dev]     # + pytest, ruff, mypy
```
## Features

### CLI

| Command | Description |
|---|---|
| `prollama init` | Create config at `~/.prollama/config.yaml` |
| `prollama start` | Start OpenAI-compatible proxy with anonymization |
| `prollama status` | Show config and provider status |
| `prollama solve DESC` | Solve a coding task via LLM orchestration |
| `prollama anonymize FILE` | Anonymize source code and show report |
| `prollama config show` | Show full config as JSON |
| `prollama shell` | Interactive REPL with tab completion |
### Interactive Shell

```text
$ prollama shell
╭─ prollama shell v0.1.0 ──────────────────────────────────╮
│ Privacy: full │ Routing: cost-optimized │ Providers: ollama│
│ Type help for commands or describe a task to solve it.   │
╰──────────────────────────────────────────────────────────╯
prollama ▸ Fix missing type hints in utils.py
  Task: Fix missing type hints in utils.py
  Type: fix   Complexity: simple   Model: qwen2.5-coder:7b
  Solving...
  ✓ Solved in 1 iter, 2.3s, $0.0000
prollama ▸ history
prollama ▸ models
prollama ▸ cost
```
### Proxy Server

```bash
prollama start
export OPENAI_BASE_URL=http://localhost:8741/v1
# Now every request from any OpenAI-compatible tool is anonymized automatically
```

Endpoints: `/v1/chat/completions`, `/v1/models`, `/v1/anonymize`, `/health`, `/metrics`
## Model Routing
Cost-optimized escalation: cheap model → mid → premium → top.
| Tier | Examples | When |
|---|---|---|
| CHEAP | qwen2.5-coder:7b (local) | Typos, lint, formatting |
| MID | qwen2.5-coder:32b, DeepSeek | Bug fixes, error handling |
| PREMIUM | GPT-4o-mini, Claude Haiku | Refactors, new endpoints |
| TOP | GPT-4o, Claude Sonnet | Architecture, multi-file |
In practice, roughly 60-80% of tasks resolve on the cheap tier, keeping costs minimal.
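The escalation ladder above can be sketched in a few lines: try the cheapest tier, and only move up if the attempt fails its checks. This is a minimal sketch under illustrative assumptions (the `attempt` callback stands in for the real solve-and-test loop, and the tier/model pairings mirror the table above):

```python
from typing import Callable

# Illustrative tier ladder, ordered by cost; names mirror the table above.
TIERS = [
    ("CHEAP", "qwen2.5-coder:7b"),
    ("MID", "qwen2.5-coder:32b"),
    ("PREMIUM", "gpt-4o-mini"),
    ("TOP", "gpt-4o"),
]

def solve_with_escalation(task: str,
                          attempt: Callable[[str, str], bool]) -> str:
    """Try each tier in cost order; escalate while the attempt fails."""
    for tier, model in TIERS:
        if attempt(model, task):  # e.g. generated patch passes the tests
            return model
    raise RuntimeError("no tier could solve the task")

# Simulated run: only the 32b model "passes the tests" here.
used = solve_with_escalation("fix bug", lambda model, task: "32b" in model)
print(used)  # qwen2.5-coder:32b
```

Because simple tasks exit at the first tier, the expensive models are only ever invoked for the tail of hard problems.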
## Architecture

```text
┌───────────────────────────────────────────────────┐
│                    prollama                       │
├──────────┬──────────────┬────────────┬────────────┤
│ Anonymizer│ Model Router │ Executor   │ Proxy     │
│ ┌────────┐│ ┌───────────┐│ ┌────────┐ │ ┌────────┐│
│ │ Regex  ││ │ Classify  ││ │ Solve  │ │ │ /v1/   ││
│ │ NLP    ││ │ Select    ││ │ Iterate│ │ │ chat/  ││
│ │ AST    ││ │ Escalate  ││ │ Test   │ │ │ compl. ││
│ └────────┘│ └───────────┘│ └────────┘ │ └────────┘│
└──────────┴──────────────┴────────────┴────────────┘
      ↕              ↕                       ↕
 tree-sitter    LLM providers          pytest/ruff
 Presidio       (Ollama, OpenAI, Anthropic)
```
## Docker

```bash
docker compose up -d   # prollama + Ollama
```
## Development

```bash
git clone https://github.com/softreck/prollama.git
cd prollama
pip install -e ".[dev,ast]"
make test       # 124 tests
make lint       # ruff
make coverage   # pytest-cov
make check      # all of the above
```
## Project Structure

```text
prollama/
├── src/prollama/
│   ├── anonymizer/            # Three-layer anonymization pipeline
│   │   ├── regex_layer.py     # Layer 1: 70+ secret/PII patterns
│   │   ├── nlp_layer.py       # Layer 2: PII in comments (heuristic/Presidio)
│   │   ├── ast_layer.py       # Layer 3: tree-sitter identifier renaming
│   │   └── pipeline.py        # Orchestrator: regex → NLP → AST
│   ├── router/
│   │   └── model_router.py    # Cost-optimized model selection + escalation
│   ├── executor/
│   │   └── task_executor.py   # Full solve loop with test validation
│   ├── cli.py                 # Click CLI (8 commands)
│   ├── shell.py               # Interactive REPL (prompt-toolkit)
│   ├── proxy.py               # FastAPI OpenAI-compatible proxy
│   ├── config.py              # YAML config with Pydantic
│   └── models.py              # Domain models and enums
├── tests/                     # 124 tests across 8 test files
├── examples/                  # Sample code + runnable scripts
├── docs/                      # Full documentation (10 pages)
├── .github/workflows/ci.yml   # CI: Python 3.10-3.13 matrix
├── Dockerfile                 # Container image
├── docker-compose.yml         # prollama + Ollama
├── Makefile                   # Development commands
├── CHANGELOG.md
├── CONTRIBUTING.md
└── LICENSE                    # Apache 2.0
```
## Part of the pyqual Ecosystem

prollama is the fix provider for pyqual quality-gate loops.

```yaml
# pyqual.yaml
stages:
  - name: fix
    provider: prollama
    strategy: auto
    when: metrics_fail
```
## License

Licensed under Apache-2.0.