🦞 LocalClaw R03
A minimal, hackable agentic framework engineered to run entirely locally with Ollama or BitNet.
Inspired by the architecture of OpenClaw, rebuilt from scratch for local-first operation.
📚 Documentation
| Document | Description |
|---|---|
| Architecture.md | Technical documentation for developers (directory structure, core design, orchestrator modes) |
| CHANGELOG.md | Version history and release notes (R00–R03) |
| TESTS.md | Benchmark results, model recommendations, and testing guide |
Installation
From PyPI (Recommended)
pip install localclaw
# Or install from GitHub for the latest development version:
pip install git+https://github.com/VTSTech/LocalClaw.git
From Source
git clone https://github.com/VTSTech/LocalClaw.git
cd LocalClaw
pip install -e .
No Installation Required
LocalClaw uses only Python stdlib — no dependencies! You can also just copy the localclaw directory into your project:
cp -r localclaw /path/to/your/project/
Quick Start
1. Single prompt
# Simple Q&A
localclaw run "What is the capital of Japan?"
# With streaming output
localclaw run "Tell me a joke." --stream
# Specify a model
localclaw run "Explain quantum computing" -m llama3.2:3b
2. Interactive chat
# Start interactive session
localclaw chat -m qwen2.5-coder:0.5b
# With tools enabled
localclaw chat -m llama3.1:8b --tools calculator,shell,read_file,write_file
# With skills loaded
localclaw chat -m llama3.2:3b --skills skill-creator --tools write_file,shell
# Fast mode (reduced context for speed)
localclaw chat -m qwen2.5-coder:0.5b --fast --verbose
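The --skills flag above loads Agent Skills, which (per the Agent Skills specification) are directories containing a SKILL.md whose YAML-style frontmatter names and describes the skill. A minimal, illustrative frontmatter parser using only the stdlib — LocalClaw's actual loader may differ:

```python
def parse_frontmatter(text):
    """Parse simple 'key: value' frontmatter between --- fences."""
    meta = {}
    parts = text.split("---")
    if len(parts) >= 3:
        for line in parts[1].strip().splitlines():
            if ":" in line:
                key, _, value = line.partition(":")
                meta[key.strip()] = value.strip()
    return meta

sample = "---\nname: skill-creator\ndescription: Helps write new skills\n---\n# Body"
print(parse_frontmatter(sample))
# {'name': 'skill-creator', 'description': 'Helps write new skills'}
```

This deliberately handles only flat `key: value` pairs; a real loader would need a full YAML parser for nested metadata.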
3. Using the BitNet backend
localclaw chat --backend bitnet --force-react
localclaw run "Calculate 17 * 23" --backend bitnet --tools calculator
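BitNet models lack native tool calling, so --force-react makes the agent fall back to a ReAct-style text protocol, where the model writes out its chosen tool and arguments as plain text. A minimal, illustrative sketch of parsing such output (the exact prompt format LocalClaw uses may differ):

```python
import re

def parse_react(text):
    """Extract (action, action_input) from ReAct-formatted model output."""
    action = re.search(r"Action:\s*(\S+)", text)
    arg = re.search(r"Action Input:\s*(.+)", text)
    if action and arg:
        return action.group(1), arg.group(1).strip()
    return None  # no complete tool call found

reply = "Thought: I should multiply.\nAction: calculator\nAction Input: 17 * 23"
print(parse_react(reply))  # ('calculator', '17 * 23')
```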
Key Features
- Zero dependencies — uses Python stdlib only
- Ollama + BitNet backends — switch with the `--backend` flag
- Native tool calling — auto-detected for supported models, with a ReAct fallback for others
- Agent Skills — follows the Agent Skills specification
- Small model support — fuzzy matching, argument auto-fixing for models ≤1.5B params
- Built-in security — path validation, command blocklist, SSRF protection
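As an example of the small-model support above, fuzzy tool-name matching can be sketched with the stdlib's difflib; this is illustrative only, and LocalClaw's actual matcher may use different thresholds or logic:

```python
import difflib

TOOLS = ["calculator", "shell", "read_file", "write_file", "list_directory"]

def match_tool(name, tools=TOOLS, cutoff=0.6):
    """Map a possibly misspelled tool name to a known tool, or None."""
    hits = difflib.get_close_matches(name.lower(), tools, n=1, cutoff=cutoff)
    return hits[0] if hits else None

print(match_tool("calcultor"))  # 'calculator'
print(match_tool("writefile"))  # 'write_file'
```

Small models often emit near-miss tool names; recovering the intended tool avoids aborting the whole agent turn.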
CLI Commands
| Command | Description |
|---|---|
| `run "prompt"` | Run a single prompt and exit |
| `chat` | Interactive multi-turn conversation |
| `models` | List available Ollama models |
| `tools` | List built-in tools |
| `skills` | List available Agent Skills |
| `test [example]` | Run example/test scripts (`--list` to see all) |
Key Flags
| Flag | Description |
|---|---|
| `-m, --model` | Model name (default: `qwen2.5-coder:0.5b`) |
| `--tools` | Comma-separated tool list |
| `--skills` | Comma-separated skill list |
| `--backend` | `ollama` or `bitnet` |
| `--stream` | Stream output token-by-token |
| `--fast` | Preset: reduced context for speed |
| `-v, --verbose` | Show tool calls and timing |
Built-in Tools
| Tool | Description |
|---|---|
| `calculator` | Evaluate math expressions |
| `python_repl` | Execute Python code |
| `shell` | Run shell commands |
| `read_file` | Read file contents |
| `write_file` | Write content to a file |
| `list_directory` | List directory contents |
| `http_get` | Make an HTTP GET request |
| `save_note` / `get_note` | Save and retrieve notes |
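A calculator tool like the one above is commonly implemented as a restricted expression evaluator rather than a raw `eval()`. A minimal sketch using the stdlib `ast` module — illustrative, not LocalClaw's exact implementation:

```python
import ast
import operator

# Whitelist of permitted arithmetic operations.
OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def calculator(expression):
    """Safely evaluate a math expression by walking its AST."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression")
    return walk(ast.parse(expression, mode="eval").body)

print(calculator("17 * 23"))  # 391
```

Anything outside the whitelist (names, calls, attribute access) raises `ValueError`, which is what makes this safe to expose to a model.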
Configuration
| Variable | Description | Default |
|---|---|---|
| `OLLAMA_BASE_URL` | Ollama server URL | `http://localhost:11434` |
| `BITNET_BASE_URL` | BitNet server URL | `http://localhost:8765` |
| `LOCALCLAW_BACKEND` | Backend: `ollama` or `bitnet` | `ollama` |
| `LOCALCLAW_MODEL` | Default model | `qwen2.5-coder:0.5b-instruct-q4_k_m` |
| `LOCALCLAW_SECURITY_MODE` | Security mode: `strict`, `permissive`, or `disabled` | `permissive` |
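Resolving these settings follows the usual pattern of environment variables overriding defaults. A sketch of how a client might pick the backend base URL from the variables in the table (`resolve_base_url` is an illustrative helper, not LocalClaw's actual API):

```python
import os

# Defaults mirror the Configuration table above.
DEFAULTS = {
    "ollama": "http://localhost:11434",
    "bitnet": "http://localhost:8765",
}

def resolve_base_url(backend=None, env=os.environ):
    """Return (backend, base_url); environment variables win over defaults."""
    backend = backend or env.get("LOCALCLAW_BACKEND", "ollama")
    var = "OLLAMA_BASE_URL" if backend == "ollama" else "BITNET_BASE_URL"
    return backend, env.get(var, DEFAULTS[backend])

print(resolve_base_url())  # ('ollama', 'http://localhost:11434') when no env vars are set
```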
Setup Ollama
# Make sure Ollama is running:
ollama serve
# Pull a model:
ollama pull qwen2.5-coder:0.5b-instruct-q4_k_m
About
🦞 LocalClaw R03 is written and maintained by VTSTech.
- 🌐 Website: https://www.vts-tech.org
- 📦 GitHub: https://github.com/VTSTech/LocalClaw
- 💻 More projects: https://github.com/VTSTech
For more details, see:
- Architecture.md — Technical architecture and design decisions
- CHANGELOG.md — Version history and release notes
- TESTS.md — Benchmark results and model recommendations
File details
Details for the file localclaw-0.3.0.4.tar.gz (source distribution).
- Size: 163.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `4ee4eaedac8146273c0a45df3996f100619ece06bb9f1c632096e3720f9780fd` |
| MD5 | `4e5231246ecf4431bae3e3b8d9f38e92` |
| BLAKE2b-256 | `4f16dc2944e0c3ff64b18adebb8b83e64d63f991fd26f61aebd7bade887c3626` |
File details
Details for the file localclaw-0.3.0.4-py3-none-any.whl (built distribution).
- Size: 213.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `5e5f6fc55828e3d3e29ce91ebea76ceaaa21ab6ee3b23ccf5694430b722584da` |
| MD5 | `288e19f4dee0ede3a56fce0d4bcee221` |
| BLAKE2b-256 | `4c9092d0aa2908f4fd641392ace3ff77f839d1128c8532897e52b0d7480183d0` |