Minimal tooling primitives for LLM tool calling

Project description

Baretools AI

The un-framework for AI Engineers — Build AI Agents, Not Framework Wrappers.

Status: Alpha. Documentation  ·  API Reference  ·  Why Baretools?  ·  Changelog


Why

Modern agent frameworks own your prompts, your orchestration, and your state. That trade-off is fine for demos and POCs, but in production it costs you the control needed to build your own Agent Harness. You also end up knowing the framework better than the underlying engineering — which is exactly backwards.

Baretools handles only the mechanical glue between your Python functions and the LLM:

  • Function → provider tool schema (OpenAI, Anthropic, Gemini, generic JSON Schema)
  • Parsing tool calls out of provider responses
  • Validating and executing those calls (sync, async, parallel, streaming)
  • Formatting results back into provider-shaped messages

Everything else — prompts, loops, retries, memory, guardrails — stays in your code.
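The first of those mechanical steps — turning a Python signature into a tool schema — is simple enough to sketch with the stdlib alone. The following is an illustration of the idea, not baretools' actual implementation:

```python
import inspect

# Illustrative function -> JSON Schema conversion (not baretools' code):
# map annotated parameters to JSON Schema property types.
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def function_to_schema(fn):
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": PY_TO_JSON.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def get_weather(location: str) -> str:
    """Get current weather for a location."""
    ...

schema = function_to_schema(get_weather)
```

Baretools does this (plus validation and provider-specific shaping) so you don't have to maintain that mapping by hand.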

Install

pip install baretools-ai

Prerequisites:

  • Python >= 3.10
  • Optional Pydantic support: pip install "baretools-ai[pydantic]".

Quickstart

from baretools import tool, ToolRegistry, parse_tool_calls, format_tool_results
from openai import OpenAI

@tool
def get_weather(location: str) -> str:
    """Get current weather for a location."""
    return f"Sunny, 72°F in {location}"

tools = ToolRegistry()
tools.register(get_weather)

client = OpenAI()
messages = [{"role": "user", "content": "What's the weather in Paris?"}]

max_iterations = 5  # can be much higher in real-world agents
iteration = 0
while iteration < max_iterations:
    response = client.chat.completions.create(
        model="gpt-4.1",
        messages=messages,
        tools=tools.get_schemas("openai"),
    )
    iteration += 1
    message = response.choices[0].message
    messages.append(message)

    tool_calls = parse_tool_calls(message, "openai")
    if not tool_calls:
        print("Final Response:", message.content)
        break

    results = tools.execute(tool_calls)
    messages.extend(format_tool_results(results, "openai"))

You write the loop. Baretools handles the schema, parsing, execution, and formatting on each side.
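For a feel of what the parsing and formatting halves do, here is a hand-rolled sketch for OpenAI-shaped messages (field names follow the OpenAI chat completions format; this is not baretools' code):

```python
import json

# Sketch of parse/format for OpenAI-shaped messages (illustrative only).
def parse_calls(message: dict) -> list[dict]:
    calls = []
    for tc in message.get("tool_calls") or []:
        calls.append({
            "id": tc["id"],
            "name": tc["function"]["name"],
            # arguments arrive as a JSON string and must be decoded
            "arguments": json.loads(tc["function"]["arguments"]),
        })
    return calls

def format_result(call: dict, result: str) -> dict:
    # a tool result message must echo the originating call id
    return {"role": "tool", "tool_call_id": call["id"], "content": result}

message = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "get_weather", "arguments": '{"location": "Paris"}'},
    }],
}
calls = parse_calls(message)
reply = format_result(calls[0], "Sunny, 72°F in Paris")
```

parse_tool_calls() and format_tool_results() cover this shape for all four providers, including validation and error cases the sketch ignores.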

Runnable Examples

Working agents for each provider are in examples/:

OPENAI_API_KEY=...    uv run python examples/openai_agent.py
ANTHROPIC_API_KEY=... uv run python examples/anthropic_agent.py
GOOGLE_API_KEY=...    uv run python examples/gemini_agent.py

Features

  • Zero runtime dependencies — stdlib only; no transitive supply chain to audit
  • Multi-provider schemas — tools.get_schemas("openai" | "anthropic" | "gemini" | "json_schema")
  • Sync, async, streaming — execute, execute_async, execute_stream, execute_stream_async
  • Parallel tool execution — pass parallel=True with max_workers (sync) or max_concurrency (async) to fan out independent calls
  • Tool call hooks — before_tool and after_tool callbacks let you plug in guardrails, redaction, auditing, or tracing around every call
  • Retries with structured events — pass on_event=... to observe attempts/retries/failures
  • Type-driven validation — dataclasses work out of the box; pydantic BaseModels supported when installed
  • Provider-native parsing/formatting — parse_tool_calls() and format_tool_results() for all four providers
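Conceptually, the parallel execution option amounts to fanning independent calls out over a worker pool. A rough sketch of that behavior (the names below are illustrative, not baretools APIs):

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative fan-out of independent tool calls over a thread pool,
# roughly what a parallel=True / max_workers option implies.
def run_parallel(calls, max_workers=4):
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(fn, *args) for fn, args in calls]
        # collecting in submission order keeps results aligned with calls
        return [f.result() for f in futures]

def get_weather(location):
    return f"Sunny in {location}"

results = run_parallel([(get_weather, ("Paris",)), (get_weather, ("Tokyo",))])
```

The library version additionally handles per-call validation, error capture, and result formatting, which the sketch omits.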

See the docs site for the full API reference, advanced patterns, and design notes.

Development

uv sync --group dev
uv run ruff check .
uv run pytest -q

Contributing

Baretools is designed to be strictly minimal. We prefer to add only features that are universally needed across tool-calling applications and difficult for developers to implement themselves. Please open an issue to discuss your idea before proposing a new feature.

License

MIT — see LICENSE.

Project details


Download files

Download the file for your platform.

Source Distribution

baretools_ai-0.4.3.tar.gz (13.6 kB)

Uploaded Source

Built Distribution

baretools_ai-0.4.3-py3-none-any.whl (12.3 kB)

Uploaded Python 3

File details

Details for the file baretools_ai-0.4.3.tar.gz.

File metadata

  • Download URL: baretools_ai-0.4.3.tar.gz
  • Upload date:
  • Size: 13.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for baretools_ai-0.4.3.tar.gz
Algorithm Hash digest
SHA256 26bba1e1c4c33b6da6c3ce2f7bf23d790e917d10a4839a4a676f68b9ea64de30
MD5 3d7a0aad83c7fd6468c9576a4c4a81ba
BLAKE2b-256 f611a39bd1c8061e2e1e445cc11a434881a7424cbc37b916a0f94317a5f7e035

Provenance

The following attestation bundles were made for baretools_ai-0.4.3.tar.gz:

Publisher: release.yml on ndamulelonemakh/baretools-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file baretools_ai-0.4.3-py3-none-any.whl.

File metadata

  • Download URL: baretools_ai-0.4.3-py3-none-any.whl
  • Upload date:
  • Size: 12.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for baretools_ai-0.4.3-py3-none-any.whl
Algorithm Hash digest
SHA256 ecf760a96b95077601a2c53fbad530108c56f73a63a8474c00458c5a6209c6d7
MD5 6760ffb2c1650382273b4e9288f621d1
BLAKE2b-256 122429ea16bcc98cfb37b606b550819f07d8854e44cc50acb261cfdfa6a37eb0

Provenance

The following attestation bundles were made for baretools_ai-0.4.3-py3-none-any.whl:

Publisher: release.yml on ndamulelonemakh/baretools-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
