Lightweight agent runtime library for autonomous edge agents
Project description
🦙 OpenHoof v2.0
Local AI agent runtime library with FunctionGemma training
OpenHoof is a standalone, extensible library for running AI agents that persist across sessions, respond to events, and coordinate with each other. Built to work with LlamaFarm for local inference.
┌──────────────────────────────────────────────────────┐
│                   YOUR APPLICATION                   │
│            (Drone Control, Medical, etc.)            │
├──────────────────────────────────────────────────────┤
│                          │                           │
│                    Agent Runtime                     │
│                          ▼                           │
├──────────────────────────────────────────────────────┤
│                   O P E N H O O F                    │
│  ┌─────────┐  ┌─────────┐  ┌─────────┐  ┌─────────┐  │
│  │  Agent  │  │  Soul   │  │ Memory  │  │  Tools  │  │
│  │  Loop   │  │ Loading │  │ Recall  │  │Registry │  │
│  └────┬────┘  └────┬────┘  └────┬────┘  └────┬────┘  │
│       │            │            │            │       │
│       └────────────┴─────┬──────┴────────────┘       │
│                          ▼                           │
│                      LlamaFarm                       │
│                  (local inference)                   │
└──────────────────────────────────────────────────────┘
✨ What's New in v2.0
Complete rebuild as a standalone library (was a server framework in v1.x):
- 🎯 Agent Runtime – Event loop, heartbeat, exit conditions
- 🧠 Context Files – SOUL.md, MEMORY.md, USER.md, TOOLS.md as first-class citizens
- 💾 DDIL – Store-and-forward for offline operation (Denied/Degraded/Intermittent/Limited networks)
- 🔗 LlamaFarm Integration – Tools + prompts passed through in API calls
- 📡 Training Data Capture – Every tool call logged for fine-tuning
- 🏆 FunctionGemma Pipeline – Auto-generate training data, fine-tune tool routing model (GOLD!)
- 📱 Runs Anywhere – Python (now) → Kotlin (Android) → Rust (cross-platform)
🚀 Quick Start
Installation
pip install -e git+https://github.com/llama-farm/openhoof.git@feat/microclaw-rebuild#egg=openhoof
Basic Usage
import time

from openhoof import Agent, Soul, Memory

# Define your tools (OpenAI-compatible format)
tools = [
    {
        "name": "get_weather",
        "description": "Get current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"}
            },
            "required": ["location"]
        }
    }
]

# Tool executor (your implementation)
def execute_tool(tool_name: str, params: dict) -> dict:
    if tool_name == "get_weather":
        return {"temp": 72, "condition": "sunny"}
    return {"error": "Unknown tool"}

# Create agent
agent = Agent(
    soul="SOUL.md",        # Your agent's identity + mission
    memory="MEMORY.md",    # Long-term recall
    tools=tools,
    executor=execute_tool,
    llamafarm_config="llamafarm.yaml",  # LlamaFarm models
    heartbeat_interval=30.0
)

# Exit conditions
agent.on_exit("timeout", lambda: time.time() - agent.start_time > 1800)

# Heartbeat callbacks
agent.on_heartbeat(lambda: print("💓 Still alive"))

# Run agent
agent.run()
📚 Core Concepts
Context Files
OpenHoof agents are defined by markdown files:
my-agent/
├── SOUL.md      # Identity, mission, style, constraints
├── MEMORY.md    # Long-term recall, persistent context
├── USER.md      # Who the agent serves, preferences
└── TOOLS.md     # Available capabilities, tool docs
SOUL.md example:
# SOUL.md - Weather Agent
You are a helpful weather assistant AI.
**Name:** WeatherBot
**Emoji:** ☀️
## Mission
Provide accurate, timely weather information to users.
## Style
- Be concise and factual
- Always include units (ยฐF, mph, etc.)
- Warn about severe weather
Tool Schema Format
OpenHoof uses OpenAI-compatible tool schemas (same format Ace uses):
{
  "name": "drone_takeoff",
  "description": "Take off and hover at specified altitude",
  "parameters": {
    "type": "object",
    "properties": {
      "alt_m": {"type": "number", "description": "Altitude in meters"}
    },
    "required": ["alt_m"]
  }
}
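A schema in this shape is enough to sanity-check arguments before dispatching them to an executor. Here is a minimal, hand-rolled validator for illustration (OpenHoof's own validation may differ):

```python
# Map JSON Schema primitive types to Python types
JSON_TYPES = {
    "string": str,
    "number": (int, float),
    "integer": int,
    "boolean": bool,
    "object": dict,
    "array": list,
}

def validate_args(schema: dict, args: dict) -> list:
    """Return a list of problems; an empty list means the args pass."""
    params = schema.get("parameters", {})
    errors = []
    for key in params.get("required", []):
        if key not in args:
            errors.append(f"missing required parameter: {key}")
    for key, value in args.items():
        spec = params.get("properties", {}).get(key)
        if spec is None:
            errors.append(f"unknown parameter: {key}")
        elif not isinstance(value, JSON_TYPES[spec["type"]]):
            errors.append(f"{key}: expected {spec['type']}")
    return errors
```

Running the validator against a bad tool call surfaces the problem before the drone ever moves.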
Heartbeat System
Agents run a heartbeat loop every N seconds:
# Check exit conditions
agent.on_exit("battery_low", lambda: agent.custom.get("battery", 100) < 20)
agent.on_exit("timeout", lambda: time.time() - agent.start_time > 1800)

# Custom heartbeat actions
def heartbeat():
    battery = get_battery()
    agent.custom["battery"] = battery
    if battery < 30:
        print("⚠️ Low battery!")

agent.on_heartbeat(heartbeat)
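Conceptually the loop is simple: run every heartbeat callback, then test each named exit condition, and stop the first time one fires. A sketch of that control flow (not OpenHoof's actual implementation):

```python
import time

def run_heartbeat_loop(callbacks, exit_conditions, interval=30.0, sleep=time.sleep):
    """Loop until a named exit condition fires; return its name."""
    while True:
        # Run user heartbeat actions first so conditions see fresh state
        for cb in callbacks:
            cb()
        # Check exit conditions in registration order
        for name, condition in exit_conditions.items():
            if condition():
                return name
        sleep(interval)
```

Returning the condition's name lets the caller log *why* the agent stopped, which matters when a drone lands because of "battery_low" rather than "timeout".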
DDIL (Store-and-Forward)
When the network is unavailable, agents buffer data locally:

# Store data when offline
agent.ddil.store("telemetry", {
    "lat": 41.8781,
    "lon": -87.6298,
    "battery": 85
})

# Sync when network returns
agent.ddil.sync_to_server("http://gateway.local")
LlamaFarm Integration
Configure models in llamafarm.yaml:
endpoint: "http://localhost:8765/v1"
models:
  router:
    model: "functiongemma:270m"   # Fast tool routing
    temperature: 0.1
  reasoning:
    model: "qwen2.5:8b"           # Agent reasoning
    temperature: 0.7
  fallback:
    model: "gpt-4o-mini"          # Cloud fallback
Use in agent:
# Reason about a situation (can trigger tool calls)
response = agent.reason("Should I continue if battery is 25%?")
print(response['content'])
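Since LlamaFarm exposes an OpenAI-compatible endpoint, a call like `agent.reason(...)` plausibly reduces to a standard chat-completions request. The helper below is a hypothetical sketch (not OpenHoof API) of how the config above maps onto such a request:

```python
def build_chat_request(config: dict, role: str, message: str, tools=None) -> dict:
    """Build an OpenAI-style chat payload for one model role from the config."""
    model_cfg = config["models"][role]
    payload = {
        "model": model_cfg["model"],
        "temperature": model_cfg.get("temperature", 0.7),
        "messages": [{"role": "user", "content": message}],
    }
    if tools:
        # OpenAI-style function tools, wrapping the schemas shown earlier
        payload["tools"] = [{"type": "function", "function": t} for t in tools]
    return payload
```

The resulting dict would be POSTed to `{endpoint}/chat/completions`; swapping `role` between "router" and "reasoning" is all it takes to route a request to the small tool-routing model or the larger reasoning model.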
🏆 FunctionGemma Training Pipeline (THE GOLD!)
OpenHoof includes an automated training pipeline for fine-tuning FunctionGemma on your tool calling patterns.
How It Works
- Data Collection – Every tool call (input → tool selection → result) logged as training data
- Synthetic Generation – Auto-generate diverse examples for each tool
- LoRA Fine-tuning – Train FunctionGemma-270M on your tools (<300ms routing)
- GGUF Export – Export trained model for deployment
- Hot-swap – Update LlamaFarm with new model
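To make the synthetic-generation step concrete, here is a toy sketch of the idea (the real pipeline is richer): fill per-tool message templates with sampled argument values, emitting records in the training-data format shown below. The template table is hypothetical:

```python
import random

# Hypothetical per-tool templates: (message template, candidate values)
TEMPLATES = {
    "get_weather": (
        "Check the weather in {location}",
        {"location": ["Chicago", "Denver", "Austin"]},
    ),
}

def generate_examples(tool_name: str, count: int, seed: int = 0) -> list:
    """Generate `count` synthetic (message -> tool call) training records."""
    template, choices = TEMPLATES[tool_name]
    rng = random.Random(seed)  # Seeded for reproducible datasets
    examples = []
    for _ in range(count):
        args = {k: rng.choice(v) for k, v in choices.items()}
        examples.append({
            "input": {"user_message": template.format(**args)},
            "output": {"tool_calls": [{"name": tool_name, "arguments": args}]},
            "metadata": {"source": "synthetic"},
        })
    return examples
```

Because the arguments used to fill the template are also the labels, every synthetic example is correct by construction.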
Usage
# Check training data status
python -m openhoof.training.pipeline status
# Generate synthetic training data
python -m openhoof.training.pipeline generate --count 100
# Run full training pipeline
python -m openhoof.training.pipeline run
# Export data for inspection
python -m openhoof.training.pipeline export
Training Data Format
{
  "input": {
    "user_message": "Check the weather in Chicago",
    "tools": ["get_weather", "set_reminder", "search_web"]
  },
  "output": {
    "tool_calls": [
      {"name": "get_weather", "arguments": {"location": "Chicago"}}
    ]
  },
  "metadata": {
    "source": "live_usage",
    "timestamp": "2026-02-20T15:00:00"
  }
}
This is logged automatically by TrainingDataCapture on every tool call.
🏗️ Project Structure
openhoof/
├── openhoof/                  # Python library
│   ├── agent.py               # Core Agent class
│   ├── soul.py                # SOUL.md loading
│   ├── memory.py              # MEMORY.md + semantic search
│   ├── heartbeat.py           # Heartbeat + exit conditions
│   ├── events.py              # Event queue
│   ├── ddil.py                # Store-and-forward buffer
│   ├── training.py            # Training data capture
│   ├── models.py              # LlamaFarm integration
│   ├── tools/                 # Tool base classes + registry
│   │   ├── base.py            # Tool base class
│   │   ├── registry.py        # ToolRegistry
│   │   └── builtin/           # Built-in tools
│   └── tool_registry.py       # Simple registry (for basic use)
├── training/                  # FunctionGemma pipeline (THE GOLD!)
│   ├── pipeline.py            # Training pipeline orchestration
│   └── train_tool_router.py   # LoRA fine-tuning script
├── examples/                  # Example agents
├── tests/                     # Unit tests
└── llamafarm.yaml             # LlamaFarm config
🔧 Tool Schema (What Ace Uses)
OpenHoof defines a tool schema format that's 100% OpenAI-compatible. Ace uses the format itself, not the library:
DRONE_TOOLS = [
    {
        "name": "drone_takeoff",
        "description": "Take off and hover at specified altitude",
        "parameters": {
            "type": "object",
            "properties": {
                "alt_m": {"type": "number", "default": 15.0}
            },
            "required": []
        }
    },
    {
        "name": "drone_move",
        "description": "Move relative to current position",
        "parameters": {
            "type": "object",
            "properties": {
                "north_m": {"type": "number", "default": 0.0},
                "east_m": {"type": "number", "default": 0.0},
                "up_m": {"type": "number", "default": 0.0},
                "yaw_deg": {"type": "number", "default": 0.0}
            },
            "required": []
        }
    }
]
This format works with:
- FunctionGemma fine-tuning
- LlamaFarm tool calling
- OpenAI API (if using cloud fallback)
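Note that the drone schemas above rely on "default" values with empty "required" lists, so the model can omit any argument. A small sketch (hypothetical helper, not OpenHoof API) of filling in defaults before dispatching to the executor:

```python
def apply_defaults(schema: dict, args: dict) -> dict:
    """Merge schema defaults with the model's arguments; explicit args win."""
    props = schema["parameters"]["properties"]
    filled = {k: spec["default"] for k, spec in props.items() if "default" in spec}
    filled.update(args)
    return filled
```

This keeps the executor simple: it always receives a complete argument set, even when the router model emits a bare `{"name": "drone_takeoff", "arguments": {}}`.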
📦 Installation & Development
# Clone the repo
git clone https://github.com/llama-farm/openhoof.git
cd openhoof
# Install in development mode
pip install -e .
# Run tests
pytest tests/
# Generate synthetic training data
python -m openhoof.training.pipeline generate --count 100
# Train FunctionGemma
python -m openhoof.training.pipeline run
🎯 Example: Drone Agent
from openhoof import Agent
from my_drone_tools import DRONE_TOOLS, DroneToolExecutor

agent = Agent(
    soul="SOUL.md",
    memory="MEMORY.md",
    tools=DRONE_TOOLS,
    executor=DroneToolExecutor(),
    heartbeat_interval=30.0
)

# Exit on low battery or geofence breach
agent.on_exit("battery_low", lambda: agent.custom.get("battery", 100) < 20)
agent.on_exit("geofence", lambda: not agent.custom.get("in_bounds", True))

# Sync telemetry on heartbeat
def heartbeat():
    telemetry = get_telemetry()
    agent.custom["battery"] = telemetry.battery
    agent.custom["in_bounds"] = telemetry.in_geofence
    # Buffer telemetry for DDIL
    agent.ddil.store("telemetry", telemetry.to_dict())

agent.on_heartbeat(heartbeat)

# Run agent
agent.run()
🔄 Migration from v1.x
v1.x was a server (FastAPI + WebSockets + UI)
v2.0 is a library (standalone agent runtime)
If you were using v1.x:
- Server features – moved to a separate project (TBD)
- Agent runtime – now a library you import
- Tool schemas – 100% compatible, no changes needed
- Training pipeline – still here, improved!
📄 License
Apache 2.0
🙏 Acknowledgments
- LlamaFarm โ Local LLM inference
- Built with ❤️ for anyone who needs reliable local AI agents
- Special thanks to Ace (drone agent) for validating the architecture
Ready to build agents that kick into action? 🦙
No llamas were harmed in the making of this library. Several were bedazzled.
Download files
File details
Details for the file openhoof-2.0.0.tar.gz.
File metadata
- Download URL: openhoof-2.0.0.tar.gz
- Upload date:
- Size: 855.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `603a5a66a43c4b34f53e62c94a9830075066cf070ba2104f721b4440fd44dda0` |
| MD5 | `9f66c6515ae433be05cf25b81b0d66c1` |
| BLAKE2b-256 | `40d0427e07d74201ad6e1329fd70e8a38ae791e7836680a08a9e3562b29ecb86` |
File details
Details for the file openhoof-2.0.0-py3-none-any.whl.
File metadata
- Download URL: openhoof-2.0.0-py3-none-any.whl
- Upload date:
- Size: 47.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `d970f6bf0f83ca4ea8551acc7619d6d9a05ef015e99889c0ae630254811723a7` |
| MD5 | `77a0787ca78d49020875567ada562b83` |
| BLAKE2b-256 | `2cb8ef29a0c1e85b00345417fd0bf0f2e3509dc05067314529a4c45af536a459` |