
🦞 LocalClaw R03

A minimal, hackable agentic framework engineered to run entirely locally with Ollama or BitNet.

Inspired by the architecture of OpenClaw, rebuilt from scratch for local-first operation.

Written by VTSTech · GitHub


📚 Documentation

| Document | Description |
| --- | --- |
| Architecture.md | Technical documentation for developers (directory structure, core design, orchestrator modes) |
| CHANGELOG.md | Version history and release notes (R00–R03) |
| TESTS.md | Benchmark results, model recommendations, and testing guide |

Installation

From PyPI (Recommended)

```shell
pip install localclaw

# Or install from GitHub for the latest development version:
pip install git+https://github.com/VTSTech/LocalClaw.git
```

From Source

```shell
git clone https://github.com/VTSTech/LocalClaw.git
cd LocalClaw
pip install -e .
```

No Installation Required

LocalClaw uses only the Python standard library, so there is nothing else to install. You can also simply copy the localclaw directory into your project:

```shell
cp -r localclaw /path/to/your/project/
```

Quick Start

1. Single prompt

```shell
# Simple Q&A
localclaw run "What is the capital of Japan?"

# With streaming output
localclaw run "Tell me a joke." --stream

# Specify a model
localclaw run "Explain quantum computing" -m llama3.2:3b
```

2. Interactive chat

```shell
# Start interactive session
localclaw chat -m qwen2.5-coder:0.5b

# With tools enabled
localclaw chat -m llama3.1:8b --tools calculator,shell,read_file,write_file

# With skills loaded
localclaw chat -m llama3.2:3b --skills skill-creator --tools write_file,shell

# Fast mode (reduced context for speed)
localclaw chat -m qwen2.5-coder:0.5b --fast --verbose
```

3. Using BitNet backend

```shell
localclaw chat --backend bitnet --force-react
localclaw run "Calculate 17 * 23" --backend bitnet --tools calculator
```

Key Features

  • Zero dependencies — uses Python stdlib only
  • Ollama + BitNet backends — switch with --backend flag
  • Native tool calling — auto-detected for supported models, ReAct fallback for others
  • Agent Skills — follows Agent Skills specification
  • Small model support — fuzzy tool-name matching and argument auto-fixing for models with ≤1.5B parameters
  • Built-in security — path validation, command blocklist, SSRF protection
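To make the security bullet concrete, here is a minimal sketch of how path validation and a command blocklist can work. `WORKSPACE`, `BLOCKED_COMMANDS`, and both function names are illustrative assumptions, not LocalClaw's actual implementation or rule set.

```python
import os

# Illustrative values only; LocalClaw's real rules may differ.
BLOCKED_COMMANDS = {"rm", "mkfs", "shutdown"}   # example blocklist entries
WORKSPACE = os.path.abspath("workspace")        # assumed sandbox root

def is_path_allowed(path: str) -> bool:
    """Reject paths that resolve outside the workspace (blocks ../ escapes)."""
    resolved = os.path.abspath(os.path.join(WORKSPACE, path))
    return resolved == WORKSPACE or resolved.startswith(WORKSPACE + os.sep)

def is_command_allowed(command: str) -> bool:
    """Reject shell commands whose first word is on the blocklist."""
    first_word = command.strip().split()[0] if command.strip() else ""
    return first_word not in BLOCKED_COMMANDS

print(is_path_allowed("notes/todo.txt"))     # True
print(is_path_allowed("../../etc/passwd"))   # False
print(is_command_allowed("rm -rf /"))        # False
```

Resolving with `os.path.abspath` before comparing prefixes is what defeats `../` traversal; comparing raw strings would not.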

CLI Commands

| Command | Description |
| --- | --- |
| `run "prompt"` | Run a single prompt and exit |
| `chat` | Interactive multi-turn conversation |
| `models` | List available Ollama models |
| `tools` | List built-in tools |
| `skills` | List available Agent Skills |
| `test [example]` | Run example/test scripts (`--list` to see all) |

Key Flags

| Flag | Description |
| --- | --- |
| `-m`, `--model` | Model name (default: `qwen2.5-coder:0.5b`) |
| `--tools` | Comma-separated tool list |
| `--skills` | Comma-separated skill list |
| `--backend` | `ollama` or `bitnet` |
| `--stream` | Stream output token-by-token |
| `--fast` | Preset: reduced context for speed |
| `-v`, `--verbose` | Show tool calls and timing |

Built-in Tools

| Tool | Description |
| --- | --- |
| `calculator` | Evaluate math expressions |
| `python_repl` | Execute Python code |
| `shell` | Run shell commands |
| `read_file` | Read file contents |
| `write_file` | Write content to a file |
| `list_directory` | List directory contents |
| `http_get` | Make an HTTP GET request |
| `save_note` / `get_note` | Save and retrieve notes |
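As an illustration of what a tool like `calculator` has to guard against, one common safe-evaluation approach walks the expression's syntax tree and only permits whitelisted arithmetic operators, so arbitrary code can never run. This is a generic sketch of that technique, not LocalClaw's actual `calculator` code.

```python
import ast
import operator

# Whitelisted operators; any other node type is rejected.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv,
       ast.Pow: operator.pow, ast.USub: operator.neg}

def calculator(expression: str) -> float:
    """Safely evaluate an arithmetic expression by walking its AST."""
    def walk(node: ast.AST) -> float:
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression element")
    return walk(ast.parse(expression, mode="eval"))

print(calculator("17 * 23"))        # 391
print(calculator("2 ** 10 - 24"))   # 1000
```

An input such as `__import__('os')` parses to a call node, which the walker rejects with `ValueError` instead of executing.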

Configuration

| Variable | Description | Default |
| --- | --- | --- |
| `OLLAMA_BASE_URL` | Ollama server URL | `http://localhost:11434` |
| `BITNET_BASE_URL` | BitNet server URL | `http://localhost:8765` |
| `LOCALCLAW_BACKEND` | Backend: `ollama` or `bitnet` | `ollama` |
| `LOCALCLAW_MODEL` | Default model | `qwen2.5-coder:0.5b-instruct-q4_k_m` |
| `LOCALCLAW_SECURITY_MODE` | Security mode: `strict`, `permissive`, or `disabled` | `permissive` |
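These variables follow the usual environment-variable-with-default pattern. A minimal sketch of reading them, where the `load_config` helper is hypothetical and only the variable names and defaults are taken from the table above:

```python
import os

def load_config() -> dict:
    """Read LocalClaw settings from the environment, falling back to defaults."""
    return {
        "base_url": os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434"),
        "backend": os.environ.get("LOCALCLAW_BACKEND", "ollama"),
        "model": os.environ.get("LOCALCLAW_MODEL", "qwen2.5-coder:0.5b-instruct-q4_k_m"),
        "security": os.environ.get("LOCALCLAW_SECURITY_MODE", "permissive"),
    }

os.environ["LOCALCLAW_BACKEND"] = "bitnet"  # e.g. exported in a shell profile
print(load_config()["backend"])             # bitnet
```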

Setup Ollama

```shell
# Make sure Ollama is running:
ollama serve

# Pull a model:
ollama pull qwen2.5-coder:0.5b-instruct-q4_k_m
```

About

🦞 LocalClaw R03 is written and maintained by VTSTech.




Download files

Download the file for your platform.

Source Distribution

localclaw-0.3.0.5.tar.gz (163.5 kB)

Built Distribution

localclaw-0.3.0.5-py3-none-any.whl (213.6 kB)

File details

Details for the file localclaw-0.3.0.5.tar.gz.

File metadata

  • Download URL: localclaw-0.3.0.5.tar.gz
  • Size: 163.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes

Hashes for localclaw-0.3.0.5.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `6fd44b5cc42d03acea07cd563318fd013fba2ab032167c0b2f9dbd13b5ceb782` |
| MD5 | `8c2c04db23144ffcd6c204257c30bc7e` |
| BLAKE2b-256 | `3e24843793b6006b9644111545bd2fef21373dd200db9f579d55bbecf86fdca9` |
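To verify a downloaded archive against a published digest, hash it locally with Python's `hashlib`. The demo hashes a throwaway file; `sha256_of` is a generic helper shown for illustration, not tooling shipped with the package.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demo on a throwaway file; for the real check, point this at the
# downloaded archive and compare with the published SHA256 digest.
with open("demo.bin", "wb") as fh:
    fh.write(b"hello")
print(sha256_of("demo.bin"))
# → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

For the real check, `sha256_of("localclaw-0.3.0.5.tar.gz")` should match the SHA256 digest listed above.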


File details

Details for the file localclaw-0.3.0.5-py3-none-any.whl.

File metadata

  • Download URL: localclaw-0.3.0.5-py3-none-any.whl
  • Size: 213.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes

Hashes for localclaw-0.3.0.5-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `b62daabe5d492cadc9c02333b13e968185ff157e6033e41f5ab1653cfb155c40` |
| MD5 | `1e3affff4b8dda6260ca6ed3d10f45dc` |
| BLAKE2b-256 | `258781d7dad644a95c750da23eaf092bf96287d4ad28ec554684027e43cf417c` |

