Programmatic tool calling for your agent.
[ASCII art banner: PROGTC, by capsa.ai]
What is Programmatic Tool Calling?
Programmatic Tool Calling is a strategy used to orchestrate an agent's tools through code rather than through individual API round-trips. Instead of your agent requesting tools one at a time with each result being returned to its context, your agent can write code that calls multiple tools, processes their outputs, and controls what information actually enters its context window.
Programmatic Tool Calling was popularised by the likes of smolagents and Claude; progtc is a framework-agnostic implementation.
The challenge that progtc solves is that, for security, your agent's code must run in a sandboxed environment, but your tools typically run locally. You therefore need a mechanism to relay tool call requests and results between your sandbox and your application.
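The messages crossing that boundary can be pictured as two small payloads: a call request going out of the sandbox and a result coming back in. The field names below are purely illustrative, not progtc's actual wire format:

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class ToolCallRequest:
    # Sent sandbox -> client when the executing code invokes a tool
    call_id: str
    tool_name: str
    kwargs: dict

@dataclass
class ToolCallResult:
    # Sent client -> sandbox after the tool has run locally
    call_id: str
    result: str

request = ToolCallRequest(
    call_id="c1",
    tool_name="get_weather",
    kwargs={"city": "London", "country": "UK"},
)
wire = json.dumps(asdict(request))  # serialised form crossing the boundary
print(wire)
```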
Installation
pip install progtc # client only
pip install "progtc[server]" # with server
Or with uv:
uv add progtc # client only
uv add "progtc[server]" # with server
Quick Start
1. Start the Server (inside your sandbox)
progtc serve --host 0.0.0.0 --port 8000 --api-key your-secret-key
2. Execute Code from Your Client
from progtc import AsyncProgtcClient

client = AsyncProgtcClient(
    base_url="https://your-sandbox-url:8000",
    api_key="your-secret-key",
)

# Define your tools as async functions
async def get_weather(city: str, country: str) -> str:
    # Your actual implementation
    return f"Weather in {city}, {country}: Sunny, 22°C"

async def search_database(query: str) -> list[dict]:
    # Your actual implementation
    return [{"id": 1, "name": "Result"}]

# Execute LLM-generated code that uses your tools
code = """
from tools import get_weather

weather = await get_weather("London", "UK")
print(f"The weather is: {weather}")
"""

result = await client.execute_code(
    code=code,
    tools={
        "get_weather": get_weather,
        "search_database": search_database,
    },
)

print(result.stdout)  # "The weather is: Weather in London, UK: Sunny, 22°C"
print(result.stderr)  # ""
How It Works
sequenceDiagram
box rgba(100, 100, 255, 0.2) Your App
participant Client as Progtc Client
end
box rgba(100, 200, 100, 0.2) Code Sandbox
participant Server as Progtc Server
participant Process as Sub-Process
end
Client->>Server: POST /execute-code
Server->>Process: code
Note over Process: execute code
Process->>Server: tool call
Server->>Client: SSE: tool call
activate Process
Note over Process: paused
Note over Client: execute tool locally
Client->>Server: POST /tool-result
deactivate Process
Server->>Process: tool result
Note over Process: continue execution...
Process->>Server: stdout, stderr
Server->>Client: SSE: stdout, stderr
- Your client sends code + a list of available tool names to the progtc server
- The server executes the code in an isolated process, injecting a tools module
- When code calls a tool, the server streams the call back to your client via SSE
- Your client executes the tool locally and sends the result back
- The server resumes code execution with the result
- Stdout/stderr are captured and streamed back when execution completes
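The pause-and-resume loop above can be sketched with two asyncio queues standing in for the SSE channel and the result endpoint. This is an illustrative simulation, not progtc's actual implementation:

```python
import asyncio

async def sandboxed_code(calls: asyncio.Queue, results: asyncio.Queue) -> str:
    # Inside the sandbox: request a tool call, then pause until the result arrives
    await calls.put(("get_weather", {"city": "London", "country": "UK"}))
    return await results.get()

async def client_side(calls: asyncio.Queue, results: asyncio.Queue) -> None:
    # Outside the sandbox: receive the call, run the tool locally, send the result back
    name, kwargs = await calls.get()
    assert name == "get_weather"
    await results.put(f"Weather in {kwargs['city']}: Sunny, 22°C")

async def main() -> str:
    calls: asyncio.Queue = asyncio.Queue()
    results: asyncio.Queue = asyncio.Queue()
    sandbox = asyncio.create_task(sandboxed_code(calls, results))
    await client_side(calls, results)
    return await sandbox

print(asyncio.run(main()))
```

The key property the real server shares with this sketch: the sandboxed code blocks on the result, so execution only proceeds once your client has run the tool and replied.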
Code Guidelines
To use tools, your code should import them from the tools module:
from tools import my_tool
Tools are treated as async functions, so they must be awaited:
from tools import my_tool
await my_tool()
You will receive stdout and stderr, so print the variables you want to see:
from tools import tool_a, tool_b

a = await tool_a()
b = await tool_b(a)
print(b)
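One common way a server can capture that printed output is by redirecting stdout into a buffer around the executed code. This is a generic sketch of the technique, not progtc's actual capture mechanism:

```python
import contextlib
import io

# Run untrusted-ish code with stdout redirected into a buffer, so only
# explicitly printed values are collected and sent back to the agent.
buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    exec("result = 2 + 2\nprint(result)")

captured = buffer.getvalue()
```

This is why unprinted variables never reach your agent's context: only what lands in the captured stream is returned.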
You can perform multiple tool calls concurrently using asyncio.gather:
from tools import get_weather, search_database
import asyncio

# Call tools like regular async functions
weather, results = await asyncio.gather(
    get_weather("Tokyo", "Japan"),
    search_database("hotels"),
)

print(f"Weather: {weather}")
print(f"Results: {results}")
Note: The code runs in a top-level async context, so you can use await directly without defining an async function.
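For the curious, CPython supports this natively: compiling with the `ast.PyCF_ALLOW_TOP_LEVEL_AWAIT` flag yields a code object that `eval` turns into an awaitable coroutine (this is how `python -m asyncio` works). Whether progtc uses exactly this mechanism is an assumption; the sketch below just demonstrates the flag:

```python
import ast
import asyncio

# Hypothetical agent-generated snippet containing a top-level await
SOURCE = "import asyncio\nawait asyncio.sleep(0)\nresult = 40 + 2"

namespace: dict = {}

async def run_top_level(src: str, ns: dict) -> None:
    # PyCF_ALLOW_TOP_LEVEL_AWAIT permits `await` outside any function body
    code_obj = compile(src, "<agent-code>", "exec", flags=ast.PyCF_ALLOW_TOP_LEVEL_AWAIT)
    coro = eval(code_obj, ns)  # returns a coroutine when the source awaits at top level
    if coro is not None:
        await coro

asyncio.run(run_top_level(SOURCE, namespace))
print(namespace["result"])
```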
Server CLI Options
progtc serve [OPTIONS]
| Option | Default | Description |
|---|---|---|
| `--host` | `127.0.0.1` | Host to bind to |
| `--port` | `8000` | Port to bind to |
| `--api-key` | (env: `PROGTC_API_KEY`) | API key for authentication |
| `--tool-call-timeout` | `10.0` | Timeout for individual tool calls (seconds) |
| `--code-execution-timeout` | `30.0` | Total timeout for code execution (seconds) |
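The two timeout options bound how long the server waits at different granularities: per tool call and for the whole run. A standard way to enforce such a bound in asyncio (a generic sketch, not progtc's internals) is `asyncio.wait_for`:

```python
import asyncio

async def run_with_timeout(coro, seconds: float):
    # Mirrors --code-execution-timeout: cancel the run if it overruns the budget
    try:
        return await asyncio.wait_for(coro, timeout=seconds)
    except asyncio.TimeoutError:
        return "TIMEOUT_ERROR"

async def slow_code():
    await asyncio.sleep(1.0)  # stands in for long-running agent code
    return "done"

print(asyncio.run(run_with_timeout(slow_code(), 0.05)))  # prints TIMEOUT_ERROR
```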
Error Handling
The client returns a discriminated union, either success or one of several error types:
from progtc.types import MessageType

result = await client.execute_code(code, tools)

match result.type:
    case MessageType.SUCCESS:
        print(f"Stdout: {result.stdout}")
    case MessageType.SYNTAX_ERROR:
        print(f"Syntax error: {result.stderr}")
    case MessageType.RUNTIME_ERROR:
        print(f"Runtime error: {result.stderr}")
    case MessageType.TIMEOUT_ERROR:
        print(f"Timeout: {result.stderr}")
Example: Pydantic AI + E2B
See examples/e2b-example/ for a complete example using progtc with a pydantic-ai agent and an E2B sandbox.
Building AI agents? We're hiring: capsa.ai/careers