Minimal tooling primitives for LLM tool calling

Project description

Baretools AI

The un-framework for AI Engineers — Build AI Agents, Not Framework Wrappers.

Status: Alpha. Documentation  ·  API Reference  ·  Why Baretools?  ·  Changelog


Why

Modern agent frameworks own your prompts, your orchestration, and your state. That trade-off is fine for demos and POCs, but in production it costs you the control needed to build your own Agent Harness. You also end up knowing the framework better than the underlying engineering — which is exactly backwards.

Baretools handles only the mechanical glue between your Python functions and the LLM:

  • Function → provider tool schema (OpenAI, Anthropic, Gemini, generic JSON Schema)
  • Parsing tool calls out of provider responses
  • Validating and executing those calls (sync, async, parallel, streaming)
  • Formatting results back into provider-shaped messages

Everything else — prompts, loops, retries, memory, guardrails — stays in your code.
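For concreteness, here is roughly the OpenAI-format tool schema that a function like `get_weather(location: str) -> str` maps to. This is a hand-written sketch following OpenAI's published tool-calling format, not Baretools' actual output:

```python
# Hand-written sketch of the OpenAI "function" tool schema that a
# get_weather(location: str) -> str function maps to. Field names follow
# OpenAI's tool-calling spec; Baretools' exact output may differ.
def weather_tool_schema():
    return {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather for a location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                },
                "required": ["location"],
            },
        },
    }

schema = weather_tool_schema()
```

The point of the library is that you never write this dict by hand: the decorator derives it from the signature and docstring.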

Install

pip install baretools-ai

Prerequisites:

  • Python >= 3.10
  • Optional Pydantic support: pip install "baretools-ai[pydantic]".
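With or without the Pydantic extra, argument validation is type-driven. As an illustration of what dataclass-based coercion of parsed arguments looks like (illustrative only, not Baretools' implementation):

```python
from dataclasses import dataclass, fields

# Illustrative sketch of type-driven argument coercion: reject unknown
# keys, then let the dataclass apply defaults. Not Baretools' actual code.
@dataclass
class WeatherArgs:
    location: str
    units: str = "fahrenheit"

def coerce(cls, raw: dict):
    allowed = {f.name for f in fields(cls)}
    unknown = set(raw) - allowed
    if unknown:
        raise TypeError(f"unexpected arguments: {sorted(unknown)}")
    return cls(**raw)

args = coerce(WeatherArgs, {"location": "Paris"})
```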

Quickstart

from baretools import tool, ToolRegistry, parse_tool_calls, format_tool_results
from openai import OpenAI

@tool
def get_weather(location: str) -> str:
    """Get current weather for a location."""
    return f"Sunny, 72°F in {location}"

tools = ToolRegistry()
tools.register(get_weather)

client = OpenAI()
messages = [{"role": "user", "content": "What's the weather in Paris?"}]

max_iterations = 5  # cap on tool-call round trips; real-world agents often use a much higher limit
iteration = 0
while iteration < max_iterations:
    response = client.chat.completions.create(
        model="gpt-4.1",
        messages=messages,
        tools=tools.get_schemas("openai"),
    )
    iteration += 1
    message = response.choices[0].message
    messages.append(message)

    tool_calls = parse_tool_calls(message, "openai")
    if not tool_calls:
        print("Final Response:", message.content)
        break

    results = tools.execute(tool_calls)
    messages.extend(format_tool_results(results, "openai"))

You write the loop. Baretools handles the schema, parsing, execution, and formatting on each side.
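Under the hood, the parsing step mostly means decoding each tool call's JSON arguments string into Python values. A minimal illustration, using OpenAI-shaped dicts rather than SDK objects (the payload shape follows OpenAI's response format, not Baretools internals):

```python
import json

# Minimal sketch of extracting (name, args) pairs from OpenAI-shaped
# tool calls. Real responses are SDK objects; plain dicts are used here
# purely for illustration.
raw_tool_calls = [
    {
        "id": "call_1",
        "function": {
            "name": "get_weather",
            "arguments": '{"location": "Paris"}',
        },
    }
]

parsed = [
    (tc["function"]["name"], json.loads(tc["function"]["arguments"]))
    for tc in raw_tool_calls
]
```

parse_tool_calls() exists so you don't hand-roll this per provider, since each provider nests names, IDs, and arguments differently.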

Runnable Examples

Working agents for each provider are in examples/:

OPENAI_API_KEY=...    uv run python examples/openai_agent.py
ANTHROPIC_API_KEY=... uv run python examples/anthropic_agent.py
GOOGLE_API_KEY=...    uv run python examples/gemini_agent.py

Features

  • Zero runtime dependencies — stdlib only; no transitive supply chain to audit
  • Multi-provider schemas — tools.get_schemas("openai" | "anthropic" | "gemini" | "json_schema")
  • Sync, async, streaming — execute, execute_async, execute_stream, execute_stream_async
  • Parallel tool execution — pass parallel=True with max_workers (sync) or max_concurrency (async) to fan out independent calls
  • Tool call hooks — before_tool and after_tool callbacks let you plug in guardrails, redaction, auditing, or tracing around every call
  • Retries with structured events — pass on_event=... to observe attempts/retries/failures
  • Type-driven validation — dataclasses work out of the box; Pydantic BaseModels are supported when installed
  • Provider-native parsing/formatting — parse_tool_calls() and format_tool_results() for all four providers
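The hook bullets above describe the guardrail pattern in general terms; here is a standalone sketch of what before/after callbacks can do. The hook signatures below are assumptions for illustration — consult the API reference for the real ones:

```python
# Sketch of the guardrail pattern that before_tool / after_tool hooks
# enable: screen arguments going in, redact output coming back.
# Signatures here are illustrative assumptions, not the Baretools API.
SECRETS = {"sk-live-abc123"}

def before_tool(name: str, args: dict) -> dict:
    # Block calls that try to pass a known secret into a tool.
    for v in args.values():
        if isinstance(v, str) and v in SECRETS:
            raise PermissionError(f"secret value passed to {name}")
    return args

def after_tool(name: str, result: str) -> str:
    # Redact secrets that leak into tool output before the LLM sees them.
    for s in SECRETS:
        result = result.replace(s, "[REDACTED]")
    return result

out = after_tool("get_weather", "key=sk-live-abc123 weather=Sunny")
```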

See the docs site for the full API reference, advanced patterns, and design notes.

Development

uv sync --group dev
uv run ruff check .
uv run pytest -q

Contributing

Baretools is designed to be strictly minimal. We prefer to add only features that are universally needed across tool-calling applications and difficult for developers to implement themselves. Please open an issue to discuss your idea before proposing a new feature.

License

MIT — see LICENSE.

Project details


Download files

Download the file for your platform.

Source Distribution

baretools_ai-0.4.4.tar.gz (13.6 kB)

Built Distribution

baretools_ai-0.4.4-py3-none-any.whl (12.3 kB)

File details

Details for the file baretools_ai-0.4.4.tar.gz.

File metadata

  • Download URL: baretools_ai-0.4.4.tar.gz
  • Size: 13.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for baretools_ai-0.4.4.tar.gz:

  • SHA256: 657239f5683b3144bb2967882e264afcb0a9de8f4acb8f22a47f2b178976b6fd
  • MD5: 87401e40a721721be2c8bd6340df1b48
  • BLAKE2b-256: dbb0d4b778cd6c39938e8342c1d2e712ae7bcbc375333d0eab10ef3991a3362d

Provenance

The following attestation bundles were made for baretools_ai-0.4.4.tar.gz:

Publisher: release.yml on ndamulelonemakh/baretools-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file baretools_ai-0.4.4-py3-none-any.whl.

File metadata

  • Download URL: baretools_ai-0.4.4-py3-none-any.whl
  • Size: 12.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for baretools_ai-0.4.4-py3-none-any.whl:

  • SHA256: cb38a20dc0ed42f84b04eb4a8ce5606848e52c6b8ed060534bb87cd0bdeaf73b
  • MD5: 05615b82b82102ec4d50b1b5c1f2dfaa
  • BLAKE2b-256: bb5ef7c41bef10f249be90b9cd3eeadc6af6e6fe4db794dcff461a27e843bb9e

Provenance

The following attestation bundles were made for baretools_ai-0.4.4-py3-none-any.whl:

Publisher: release.yml on ndamulelonemakh/baretools-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
