# barebone

LLM primitives for Python. Build agents your way.
```python
import os
from barebone import Agent, tool

@tool
def get_weather(city: str) -> str:
    return f"72°F in {city}"

api_key = os.environ["ANTHROPIC_API_KEY"]
agent = Agent("claude-sonnet-4", api_key=api_key, tools=[get_weather])
print(agent.run_sync("Weather in Tokyo?").content)
```
## Install

```bash
pip install barebone
```
## Quick Start

```python
import os
from barebone import Agent, tool

@tool
def calculate(expression: str) -> str:
    # Note: eval() executes arbitrary code; use a safe expression parser in production.
    return str(eval(expression))

api_key = os.environ["ANTHROPIC_API_KEY"]
agent = Agent("claude-sonnet-4", api_key=api_key, tools=[calculate])
response = agent.run_sync("What is 123 * 456?")
print(response.content)
```
## Agent

The `Agent` class handles the tool loop automatically:

```python
import os
from barebone import Agent

api_key = os.environ["ANTHROPIC_API_KEY"]
agent = Agent(
    "claude-sonnet-4",
    api_key=api_key,
    tools=[calculate, "Glob", "Read"],  # Mix custom and built-in tools
    system="You are a helpful assistant.",
    max_turns=10,  # Safety limit
)

# Sync
response = agent.run_sync("What files are here?")

# Async
response = await agent.run("What files are here?")

# Streaming
async for event in agent.stream("Write a poem"):
    if hasattr(event, "text"):
        print(event.text, end="")
```
## Multi-turn Conversations

```python
agent = Agent("claude-sonnet-4", api_key=api_key)
response = agent.run_sync("My name is Alice.")
response = agent.run_sync("What's my name?")  # Remembers context

agent.clear_messages()  # Reset conversation
```
## Tools

### `@tool` Decorator

```python
from barebone import tool

@tool
def search(query: str, limit: int = 10) -> str:
    return f"Found {limit} results for {query}"

@tool("custom_name")
def my_func(x: int) -> int:
    return x * 2
```
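A decorator like this has to describe the function to the model, which it can do by reading the signature. The sketch below is illustrative only (it is not barebone's actual implementation): it derives a JSON-schema-style tool description from type hints, treating parameters without defaults as required.

```python
import inspect

# Illustrative only: build a JSON-schema-like tool description from a
# function's signature, the way a decorator such as @tool might.
def describe(func):
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    properties, required = {}, []
    for name, param in inspect.signature(func).parameters.items():
        properties[name] = {"type": type_map.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "input_schema": {"type": "object", "properties": properties, "required": required},
    }

def search(query: str, limit: int = 10) -> str:
    """Search for documents."""
    return f"Found {limit} results for {query}"

schema = describe(search)
```

Here `search` ends up with `query` as a required string and `limit` as an optional integer, which is the shape a provider's tool-use API expects.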
### `Tool` Class

For more control, use the `Tool` class:

```python
from barebone import Tool, Param

class GetWeather(Tool):
    """Get weather for a city."""

    city: str = Param(description="City name")
    units: str = Param(default="fahrenheit")

    def execute(self) -> str:
        return f"72° in {self.city}"
```
### Built-in Tools

```python
from barebone import Read, Write, Edit, Bash, Glob, Grep
from barebone import WebFetch, WebSearch, HttpRequest
from barebone import Python

# Use by name with Agent
agent = Agent("claude-sonnet-4", api_key=api_key, tools=["Read", "Bash", "Glob"])
```

| Tool | Description |
|---|---|
| `Read` | Read files |
| `Write` | Write files |
| `Edit` | Find and replace |
| `Bash` | Run commands |
| `Glob` | Find files by pattern |
| `Grep` | Search file contents |
| `WebFetch` | Fetch web pages |
| `WebSearch` | Search the web |
| `HttpRequest` | HTTP requests |
| `Python` | Execute Python |
## Hooks

Control tool execution:

```python
from barebone import Agent, Hooks

hooks = Hooks()

@hooks.before
def log_call(tool_call):
    print(f"Calling: {tool_call.name}")

@hooks.before
def block_dangerous(tool_call):
    if tool_call.name == "Bash":
        if "rm " in tool_call.arguments.get("command", ""):
            raise Hooks.Deny("Dangerous command blocked")

@hooks.after
def log_result(tool_call, result):
    print(f"Result: {result[:100]}")

agent = Agent("claude-sonnet-4", api_key=api_key, tools=["Bash"], hooks=hooks)
```
## Primitives

For full control, use the primitives directly:

```python
import os
from barebone import complete, execute, user, tool_result

api_key = os.environ["ANTHROPIC_API_KEY"]
tools = [GetWeather]
messages = [user("What's the weather in Paris?")]

while True:
    response = complete("claude-sonnet-4", messages, api_key=api_key, tools=tools)
    if not response.tool_calls:
        print(response.content)
        break
    for tc in response.tool_calls:
        result = execute(tc, tools)
        messages.append(tool_result(tc, result))
```
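The loop above is the whole agent pattern: call the model, execute any requested tools, feed the results back, and stop when the model answers in plain text. To see that control flow without network calls or API keys, here is a self-contained sketch with stubbed stand-ins; the stub types and their fields are assumptions for illustration, not barebone's actual response objects.

```python
from dataclasses import dataclass, field

# Stub types standing in for the library's response objects (assumed shapes).
@dataclass
class ToolCall:
    name: str
    arguments: dict

@dataclass
class Response:
    content: str = ""
    tool_calls: list = field(default_factory=list)

def fake_complete(messages):
    # First turn: the "model" requests a tool call.
    # Second turn: it answers using the tool result it was fed back.
    if not any(m.get("role") == "tool" for m in messages):
        return Response(tool_calls=[ToolCall("get_weather", {"city": "Paris"})])
    return Response(content="It's 72°F in Paris.")

def fake_execute(tc):
    return f"72°F in {tc.arguments['city']}"

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
while True:
    response = fake_complete(messages)
    if not response.tool_calls:
        final = response.content
        break
    for tc in response.tool_calls:
        messages.append({"role": "tool", "content": fake_execute(tc)})
```

The loop terminates exactly when a response arrives with no tool calls, which is why `max_turns` on `Agent` exists as a safety limit.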
## Streaming

```python
from barebone import astream, user, TextDelta, Done

async for event in astream("claude-sonnet-4", [user("Write a poem")], api_key=api_key):
    if isinstance(event, TextDelta):
        print(event.text, end="", flush=True)
    elif isinstance(event, Done):
        print(f"\n\nTokens: {event.response.usage.total_tokens}")
```
## Structured Output

```python
from pydantic import BaseModel
from barebone import complete, user

class Answer(BaseModel):
    answer: str
    confidence: float

response = complete(
    "claude-sonnet-4",
    [user("What is the capital of France?")],
    api_key=api_key,
    response_model=Answer,
)
print(response.parsed.answer)      # "Paris"
print(response.parsed.confidence)  # 0.99
```
## Async Primitives

```python
from barebone import acomplete, aexecute, astream

response = await acomplete("claude-sonnet-4", messages, api_key=api_key, tools=tools)
result = await aexecute(tool_call, tools)

async for event in astream("claude-sonnet-4", messages, api_key=api_key):
    ...
```
## Memory

Persist conversations:

```python
from barebone import Memory

memory = Memory("./chat.db")  # SQLite
memory.log("user", "Hello")
memory.log("assistant", "Hi there!")
messages = memory.get_messages()
```
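A SQLite-backed store like this only needs an append-only log table. The following is a minimal sketch of the idea using the standard-library `sqlite3` module; the class name, schema, and methods are illustrative, not barebone's actual implementation.

```python
import sqlite3

# Minimal illustration of a SQLite-backed conversation log.
# The real Memory class's schema and API may differ.
class SqliteMemory:
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS messages "
            "(id INTEGER PRIMARY KEY, role TEXT, content TEXT)"
        )

    def log(self, role, content):
        self.conn.execute(
            "INSERT INTO messages (role, content) VALUES (?, ?)", (role, content)
        )
        self.conn.commit()

    def get_messages(self):
        rows = self.conn.execute("SELECT role, content FROM messages ORDER BY id")
        return [{"role": r, "content": c} for r, c in rows]

memory = SqliteMemory()
memory.log("user", "Hello")
memory.log("assistant", "Hi there!")
```

Because each `log` call commits immediately, the conversation survives process restarts when a file path is used instead of `:memory:`.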
## Authentication

Pass the API key explicitly:

```python
import os

api_key = os.environ["ANTHROPIC_API_KEY"]  # or OPENROUTER_API_KEY
agent = Agent("claude-sonnet-4", api_key=api_key)
```
## API Reference

### Agent

```python
Agent(
    model: str,
    *,
    api_key: str,        # Required
    tools: list = None,  # Tool classes, @tool functions, or names like "Read"/"Bash"
    system: str = None,
    memory: Memory = None,
    hooks: Hooks = None,
    max_turns: int = 10,
)
```
| Method | Description |
|---|---|
| `run(prompt)` | Async tool loop, returns `Response` |
| `run_sync(prompt)` | Sync wrapper |
| `stream(prompt)` | Async generator yielding events |
| `clear_messages()` | Reset conversation |
| `add_tool(tool)` | Add a tool dynamically |

| Property | Description |
|---|---|
| `messages` | Conversation history |
| `tools` | Resolved `ToolDef`s |
### Primitives

| Function | Description |
|---|---|
| `complete(model, messages, **kwargs)` | Single LLM call |
| `acomplete(model, messages, **kwargs)` | Async LLM call |
| `stream(model, messages, **kwargs)` | Stream response (returns async iterator) |
| `astream(model, messages, **kwargs)` | Async stream |
| `execute(tool_call, tools)` | Execute a tool |
| `aexecute(tool_call, tools)` | Async execute |
| `user(content)` | Create a user message |
| `assistant(content)` | Create an assistant message |
| `tool_result(tool_call, result)` | Create a tool result message |
`complete`/`acomplete` kwargs:

- `api_key`: Required. Anthropic or OpenRouter API key
- `system`: System prompt
- `tools`: List of tools
- `response_model`: Pydantic model for structured output
- `max_tokens`: Max response tokens (default: 8192)
- `temperature`: Sampling temperature
- `timeout`: Timeout in seconds (raises `asyncio.TimeoutError`)
### Hooks

| Method | Description |
|---|---|
| `@hooks.before` | Before hook. Raise `Deny` to reject. |
| `@hooks.after` | After hook. Return value replaces the result. |
| `hooks.run(tool_call, tools)` | Execute with hooks |
| `hooks.arun(tool_call, tools)` | Async execute with hooks |
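To make the before/after semantics concrete, here is a dependency-free sketch of how such a hook runner could behave: before-hooks run first and may veto by raising, after-hooks may replace the result by returning a value. The class and call shapes here are illustrative assumptions, not barebone's internals.

```python
class Deny(Exception):
    """Raised by a before-hook to reject a tool call."""

class MiniHooks:
    # Illustrative hook runner: before-hooks can veto, after-hooks can
    # replace the result by returning a non-None value.
    def __init__(self):
        self.before_hooks, self.after_hooks = [], []

    def before(self, fn):
        self.before_hooks.append(fn)
        return fn

    def after(self, fn):
        self.after_hooks.append(fn)
        return fn

    def run(self, name, args, execute):
        for fn in self.before_hooks:
            fn(name, args)  # may raise Deny, aborting the call
        result = execute(name, args)
        for fn in self.after_hooks:
            replacement = fn(name, args, result)
            if replacement is not None:
                result = replacement
        return result

hooks = MiniHooks()

@hooks.before
def block_rm(name, args):
    if name == "Bash" and "rm " in args.get("command", ""):
        raise Deny("blocked")

@hooks.after
def truncate(name, args, result):
    return result[:10]
```

With this wiring, a `Bash` call containing `rm ` never reaches `execute`, while every other result passes through `truncate` before being returned.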
## License

MIT
## File details

Details for the file `barebone-0.1.3.tar.gz`.

### File metadata

- Download URL: barebone-0.1.3.tar.gz
- Upload date:
- Size: 34.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e40543695ecfb60f17f81615558427a9394dd1b154c1dec0160c601da3168153` |
| MD5 | `3aecff62938e30200b37e5be6f180ba6` |
| BLAKE2b-256 | `e41cf8cce4a502eb56fb9fa8cff15900d4ff7a274de92dd991910a1289e56343` |
## File details

Details for the file `barebone-0.1.3-py3-none-any.whl`.

### File metadata

- Download URL: barebone-0.1.3-py3-none-any.whl
- Upload date:
- Size: 32.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `5e86035c7cfa033f76b419dda9214ada1bb160ce2de3456bc7206e2c79c6ccb8` |
| MD5 | `c0d9f0b0da6394703fcf3ab6d5543e21` |
| BLAKE2b-256 | `8ac511aca3de2fb775efd77e17d84522a5ffddc6e1e506943011ae235eab7fc4` |