
╔══════════════════════════════════════════════════════════╗
║   ██████╗ ██████╗  ██████╗  ██████╗ ████████╗ ██████╗    ║
║   ██╔══██╗██╔══██╗██╔═══██╗██╔════╝ ╚══██╔══╝██╔════╝    ║
║   ██████╔╝██████╔╝██║   ██║██║  ███╗   ██║   ██║         ║
║   ██╔═══╝ ██╔══██╗██║   ██║██║   ██║   ██║   ██║         ║
║   ██║     ██║  ██║╚██████╔╝╚██████╔╝   ██║   ╚██████╗    ║
║   ╚═╝     ╚═╝  ╚═╝ ╚═════╝  ╚═════╝    ╚═╝    ╚═════╝    ║
║                                           by capsa.ai    ║
╚══════════════════════════════════════════════════════════╝

Programmatic tool calling for your agent.



What is Programmatic Tool Calling?

Programmatic Tool Calling is a strategy for orchestrating an agent's tools through code rather than through individual API round-trips. Instead of requesting tools one at a time, with each result returned to its context, your agent writes code that calls multiple tools, processes their outputs, and controls what information actually enters its context window.
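As an illustration of the idea (plain Python, not the progtc API; `list_orders` is a hypothetical tool), the agent's code can aggregate a large tool result locally so that only a one-line summary ever reaches its context:

```python
# Illustrative sketch: the agent-written code filters tool output itself,
# so the context window receives a summary instead of 1000 raw records.
import asyncio

# Hypothetical tool returning many records.
async def list_orders() -> list[dict]:
    return [{"id": i, "total": i * 10} for i in range(1000)]

async def summarise() -> str:
    orders = await list_orders()
    # Only this aggregate would enter the agent's context window.
    big = [o for o in orders if o["total"] > 9000]
    return f"{len(big)} orders over 9000"

print(asyncio.run(summarise()))  # 99 orders over 9000
```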

Programmatic Tool Calling was popularised by the likes of smolagents and Claude. progtc is a framework-agnostic implementation.

The challenge that progtc solves: for security, your agent's code must run in a sandboxed environment, but your tools typically run locally. You therefore need a mechanism to relay tool-call requests and results between your sandbox and your application.

Installation

pip install progtc # client only
pip install "progtc[server]" # with server

Or with uv:

uv add progtc # client only
uv add "progtc[server]" # with server

Quick Start

1. Start the Server (inside your sandbox)

progtc serve --host 0.0.0.0 --port 8000 --api-key your-secret-key

2. Execute Code from Your Client

from progtc import AsyncProgtcClient

client = AsyncProgtcClient(
    base_url="https://your-sandbox-url:8000",
    api_key="your-secret-key",
)

# Define your tools as async functions
async def get_weather(city: str, country: str) -> str:
    # Your actual implementation
    return f"Weather in {city}, {country}: Sunny, 22°C"

async def search_database(query: str) -> list[dict]:
    # Your actual implementation
    return [{"id": 1, "name": "Result"}]

# Execute LLM-generated code that uses your tools
code = """
from tools import get_weather

weather = await get_weather("London", "UK")
print(f"The weather is: {weather}")
"""

result = await client.execute_code(
    code=code,
    tools={
        "get_weather": get_weather,
        "search_database": search_database,
    },
)

print(result.stdout)  # "The weather is: Weather in London, UK: Sunny, 22°C"
print(result.stderr)  # ""
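A small convenience (plain Python, not part of progtc): since the tools mapping keys must match the names the generated code imports from the tools module, you can build the dict from the functions themselves so the names stay in sync:

```python
# Build the tools mapping from the functions' own names, avoiding a
# mismatch between the dict key and the name the sandboxed code imports.
async def get_weather(city: str, country: str) -> str:
    return f"Weather in {city}, {country}: Sunny, 22°C"

async def search_database(query: str) -> list[dict]:
    return [{"id": 1, "name": "Result"}]

tools = {fn.__name__: fn for fn in (get_weather, search_database)}
print(sorted(tools))  # ['get_weather', 'search_database']
```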

How It Works

sequenceDiagram
    box rgba(100, 100, 255, 0.2) Your App
        participant Client as Progtc Client
    end
    box rgba(100, 200, 100, 0.2) Code Sandbox
        participant Server as Progtc Server
        participant Process as Sub-Process
    end

    Client->>Server: POST /execute-code
    Server->>Process: code

    Note over Process: execute code

    Process->>Server: tool call
    Server->>Client: SSE: tool call

    activate Process
    Note over Process: paused

    Note over Client: execute tool locally

    Client->>Server: POST /tool-result
    deactivate Process
    Server->>Process: tool result

    Note over Process: continue execution...

    Process->>Server: stdout, stderr
    Server->>Client: SSE: stdout, stderr

  1. Your client sends code + a list of available tool names to the progtc server
  2. The server executes the code in an isolated process, injecting a tools module
  3. When code calls a tool, the server streams the call back to your client via SSE
  4. Your client executes the tool locally and sends the result back
  5. The server resumes code execution with the result
  6. Stdout/stderr are captured and streamed back when execution completes
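The pause/resume in steps 3-5 can be sketched as a toy relay (this is an illustration with in-process queues, not progtc internals — the real implementation uses HTTP and SSE):

```python
# Toy relay: the sandboxed code blocks on a tool call, the "server"
# forwards it to the "client", and execution resumes with the result.
import asyncio

async def sandboxed_code(call_tool) -> str:
    # Stands in for LLM-written code running inside the sandbox.
    weather = await call_tool("get_weather", {"city": "London"})
    return f"The weather is: {weather}"

async def run_relay() -> str:
    requests: asyncio.Queue = asyncio.Queue()  # sandbox -> client ("SSE: tool call")
    results: asyncio.Queue = asyncio.Queue()   # client -> sandbox ("POST /tool-result")

    async def call_tool(name: str, args: dict):
        await requests.put((name, args))
        return await results.get()  # paused until the result arrives

    async def client_side():
        name, args = await requests.get()
        # The client executes the tool locally, then sends the result back.
        await results.put(f"Sunny in {args['city']}")

    output, _ = await asyncio.gather(sandboxed_code(call_tool), client_side())
    return output

print(asyncio.run(run_relay()))  # The weather is: Sunny in London
```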

Code Guidelines

To use tools, your code should import them from the tools module:

from tools import my_tool

Tools are exposed as async functions, so they must be awaited:

from tools import my_tool
await my_tool()

You will receive stdout and stderr, so print the variables you want to see:

from tools import tool_a, tool_b
a = await tool_a()
b = await tool_b(a)
print(b)

You can perform multiple tool calls concurrently with asyncio.gather:

from tools import get_weather, search_database
import asyncio

# Call tools like regular async functions
weather, results = await asyncio.gather(
    get_weather("Tokyo", "Japan"),
    search_database("hotels"),
)

print(f"Weather: {weather}")
print(f"Results: {results}")

Note: The code runs in a top-level async context, so you can use await directly without defining an async function.
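When gathering several tool calls, one failure normally cancels the rest and surfaces as a runtime error. This is plain asyncio behaviour, not progtc-specific, but it applies equally inside generated code: passing return_exceptions=True lets the other calls complete and returns the exception in place of a result.

```python
# Standard asyncio: return_exceptions=True keeps one failing call
# from discarding the results of the others.
import asyncio

async def ok() -> str:
    return "fine"

async def broken() -> str:
    raise RuntimeError("tool failed")

async def main() -> list:
    return await asyncio.gather(ok(), broken(), return_exceptions=True)

results = asyncio.run(main())
print(results)  # ['fine', RuntimeError('tool failed')]
```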

Server CLI Options

progtc serve [OPTIONS]

Option                    Default                Description
--host                    127.0.0.1              Host to bind to
--port                    8000                   Port to bind to
--api-key                 (env: PROGTC_API_KEY)  API key for authentication
--tool-call-timeout       10.0                   Timeout for individual tool calls (seconds)
--code-execution-timeout  30.0                   Total timeout for code execution (seconds)

Error Handling

The client returns a discriminated union—either success or one of several error types:

from progtc.types import MessageType

result = await client.execute_code(code, tools)

match result.type:
    case MessageType.SUCCESS:
        print(f"Stdout: {result.stdout}")
    case MessageType.SYNTAX_ERROR:
        print(f"Syntax error: {result.stderr}")
    case MessageType.RUNTIME_ERROR:
        print(f"Runtime error: {result.stderr}")
    case MessageType.TIMEOUT_ERROR:
        print(f"Timeout: {result.stderr}")
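One common pattern is to feed stderr back to the model and retry on a syntax or runtime error. A hedged sketch (not a progtc API): `execute` stands in for client.execute_code and `revise` for your LLM call, and results are assumed to expose .type and .stderr as in the match example above.

```python
# Repair loop: re-run the code, handing the model the traceback after
# each retryable failure, up to max_attempts times.
async def run_with_repair(code, execute, revise, retryable, max_attempts=3):
    result = None
    for _ in range(max_attempts):
        result = await execute(code)
        if result.type not in retryable:
            return result
        # Give the model the error output and ask for corrected code.
        code = await revise(code, result.stderr)
    return result  # still failing after max_attempts
```

With progtc you might pass retryable={MessageType.SYNTAX_ERROR, MessageType.RUNTIME_ERROR}.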

Example: Pydantic AI + E2B

See examples/e2b-example/ for a complete example using progtc with a pydantic-ai agent and an E2B sandbox.


Building AI agents? We're hiring: capsa.ai/careers
