
🦞 LocalClaw R03

A minimal, hackable agentic framework engineered to run entirely locally with Ollama or BitNet.

Inspired by the architecture of OpenClaw, rebuilt from scratch for local-first operation.

Written by VTSTech · GitHub: https://github.com/VTSTech/LocalClaw


📚 Documentation

| Document | Description |
| --- | --- |
| Architecture.md | Technical documentation for developers (directory structure, core design, orchestrator modes) |
| CHANGELOG.md | Version history and release notes (R00–R03) |
| TESTS.md | Benchmark results, model recommendations, and testing guide |

Installation

From PyPI (Recommended)

pip install localclaw

# Or install from GitHub for the latest development version:
pip install git+https://github.com/VTSTech/LocalClaw.git

From Source

git clone https://github.com/VTSTech/LocalClaw.git
cd LocalClaw
pip install -e .

No Installation Required

LocalClaw uses only Python stdlib — no dependencies! You can also just copy the localclaw directory into your project:

cp -r localclaw /path/to/your/project/
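
If you vendor the directory this way, a quick stdlib-only check (illustrative, not part of LocalClaw) confirms the copied package is importable from your project:

```python
# Illustrative sanity check: confirm the vendored package can be found on
# the import path, with no third-party dependencies installed.
import importlib.util

def is_importable(name):
    """Return True if `name` can be located on the current import path."""
    return importlib.util.find_spec(name) is not None

print(is_importable("localclaw"))  # True once the directory is in place
```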

Quick Start

1. Single prompt

# Simple Q&A
localclaw run "What is the capital of Japan?"

# With streaming output
localclaw run "Tell me a joke." --stream

# Specify a model
localclaw run "Explain quantum computing" -m llama3.2:3b

2. Interactive chat

# Start interactive session
localclaw chat -m qwen2.5-coder:0.5b

# With tools enabled
localclaw chat -m llama3.1:8b --tools calculator,shell,read_file,write_file

# With skills loaded
localclaw chat -m llama3.2:3b --skills skill-creator --tools write_file,shell

# Fast mode (reduced context for speed)
localclaw chat -m qwen2.5-coder:0.5b --fast --verbose

3. Using BitNet backend

localclaw chat --backend bitnet --force-react
localclaw run "Calculate 17 * 23" --backend bitnet --tools calculator

Key Features

  • Zero dependencies — uses Python stdlib only
  • Ollama + BitNet backends — switch with --backend flag
  • Native tool calling — auto-detected for supported models, ReAct fallback for others
  • Agent Skills — follows Agent Skills specification
  • Small model support — fuzzy matching, argument auto-fixing for models ≤1.5B params
  • Built-in security — path validation, command blocklist, SSRF protection
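
LocalClaw's actual security code is not reproduced here, but the path-validation idea can be sketched in a few lines of stdlib Python. This is an illustration of the general technique (with a hypothetical `is_path_allowed` helper), not the framework's implementation:

```python
# Sketch of path validation: resolve the path first, then test containment,
# so `..` segments and symlinks cannot escape the allowed root.
# Hypothetical helper for illustration, not LocalClaw's actual code.
from pathlib import Path

def is_path_allowed(candidate, allowed_root):
    """Return True only if `candidate` resolves to a location inside `allowed_root`."""
    root = Path(allowed_root).resolve()
    target = (root / candidate).resolve()
    return target == root or root in target.parents
```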

CLI Commands

| Command | Description |
| --- | --- |
| `run "prompt"` | Run single prompt and exit |
| `chat` | Interactive multi-turn conversation |
| `models` | List available Ollama models |
| `tools` | List built-in tools |
| `skills` | List available Agent Skills |
| `test [example]` | Run example/test scripts (`--list` to see all) |

Key Flags

| Flag | Description |
| --- | --- |
| `-m`, `--model` | Model name (default: `qwen2.5-coder:0.5b`) |
| `--tools` | Comma-separated tool list |
| `--skills` | Comma-separated skill list |
| `--backend` | `ollama` or `bitnet` |
| `--stream` | Stream output token-by-token |
| `--fast` | Preset: reduced context for speed |
| `-v`, `--verbose` | Show tool calls and timing |

Built-in Tools

| Tool | Description |
| --- | --- |
| `calculator` | Evaluate math expressions |
| `python_repl` | Execute Python code |
| `shell` | Run shell commands |
| `read_file` | Read file contents |
| `write_file` | Write content to file |
| `list_directory` | List directory contents |
| `http_get` | HTTP GET request |
| `save_note` / `get_note` | Save and retrieve notes |

Configuration

| Variable | Description | Default |
| --- | --- | --- |
| `OLLAMA_BASE_URL` | Ollama server URL | `http://localhost:11434` |
| `BITNET_BASE_URL` | BitNet server URL | `http://localhost:8765` |
| `LOCALCLAW_BACKEND` | Backend: `ollama` or `bitnet` | `ollama` |
| `LOCALCLAW_MODEL` | Default model | `qwen2.5-coder:0.5b-instruct-q4_k_m` |
| `LOCALCLAW_SECURITY_MODE` | Security mode: `strict`, `permissive`, `disabled` | `permissive` |
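
For example, a shell profile might pin a remote Ollama host, a default model, and a stricter security mode (the host address and model below are illustrative, not defaults):

```shell
# Illustrative values; adjust to your setup.
export OLLAMA_BASE_URL="http://192.168.1.50:11434"
export LOCALCLAW_MODEL="llama3.2:3b"
export LOCALCLAW_SECURITY_MODE="strict"
```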

Setup Ollama

# Make sure Ollama is running:
ollama serve

# Pull a model:
ollama pull qwen2.5-coder:0.5b-instruct-q4_k_m
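
Since LocalClaw itself is stdlib-only, you can also probe the server from Python without extra packages. The helper below is illustrative (not part of LocalClaw); it queries Ollama's standard `/api/tags` endpoint, which lists the installed models:

```python
# Illustrative health check: ask the Ollama /api/tags endpoint which
# models are installed, using only the standard library.
import json
import urllib.error
import urllib.request

def installed_models(base_url="http://localhost:11434"):
    """Return the installed model names, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None
```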

About

🦞 LocalClaw R03 is written and maintained by VTSTech.


Download files

Download the file for your platform.

Source Distribution

localclaw-0.3.0.3.tar.gz (163.9 kB)


Built Distribution


localclaw-0.3.0.3-py3-none-any.whl (214.1 kB)


File details

Details for the file localclaw-0.3.0.3.tar.gz.

File metadata

  • Download URL: localclaw-0.3.0.3.tar.gz
  • Size: 163.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes

Hashes for localclaw-0.3.0.3.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `5f9e89b0ef9d43823058160287b107812e150ed7a63c1ec4ee7c50846316096b` |
| MD5 | `08cdee059a5f375f1ac4f512deb24f0b` |
| BLAKE2b-256 | `967bb451d38cf07fe2a12979817e0bc2caddbb6cc736194f9c320693a1ad0634` |

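To verify a downloaded file against the digests above with nothing but the standard library (the `sha256_of` helper is illustrative):

```python
# Compute a file's SHA256 digest in chunks, so large archives don't need
# to fit in memory.
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Return the hex SHA256 digest of the file at `path`."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# expected = "5f9e89b0ef9d43823058160287b107812e150ed7a63c1ec4ee7c50846316096b"
# assert sha256_of("localclaw-0.3.0.3.tar.gz") == expected
```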

File details

Details for the file localclaw-0.3.0.3-py3-none-any.whl.

File metadata

  • Download URL: localclaw-0.3.0.3-py3-none-any.whl
  • Size: 214.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes

Hashes for localclaw-0.3.0.3-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `ab75c0ffeadef8e4ed1e451aced4785cfc90c6fe2def5b92aebefa1eb6cf07a1` |
| MD5 | `1dc453ae74938b37c4ef2fe8b37a387c` |
| BLAKE2b-256 | `711eeb4e39f374d6857d06697372576f4c66e3102f37b3775c564fdec28e84ec` |

