Global and per-tool call limit enforcement middleware for AI agent pipelines

Project description

🛠️ AzureAICommunity - Agent - Tool Limit Middleware

Prevent runaway tool calls by enforcing global and per-tool call limits across every AI agent completion.

Getting Started · Per-Tool Limits · Inspect Usage · How It Works


Overview

azureaicommunity-agent-tool-limit is a lightweight guard layer for AI agent pipelines built on agent-framework. During each completion it tracks every function_call content item emitted by the model and silently suppresses any calls that breach a configurable global cap or an optional per-tool cap. When calls are suppressed the optional on_limit_exceeded callback is invoked.

The middleware does not raise an exception — it drops the over-limit calls so the agent loop can continue cleanly, mirroring the behaviour of the .NET ToolLimitMiddleware.
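Conceptually, the suppression step is a filter over the function calls emitted in a completion: calls under the cap pass through, calls over it are dropped. The following is a minimal, self-contained sketch of that idea only — the `FunctionCall` stand-in and counter names are illustrative, not the package's internals:

```python
from dataclasses import dataclass


@dataclass
class FunctionCall:
    # Illustrative stand-in for a function_call content item.
    name: str


@dataclass
class CallFilter:
    global_max: int
    total: int = 0

    def filter(self, calls):
        # Keep calls while under the global cap; drop (suppress) the rest
        # instead of raising, so the agent loop can continue cleanly.
        kept = []
        for call in calls:
            if self.total < self.global_max:
                self.total += 1
                kept.append(call)
        return kept


f = CallFilter(global_max=2)
kept = f.filter([FunctionCall("a"), FunctionCall("b"), FunctionCall("c")])
# Only the first two calls survive; the third is silently removed.
```

Because nothing is raised, the model still receives results for the allowed calls and the conversation proceeds normally.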


✨ Features

🔢 Global call cap — limits the total number of tool invocations per session
🔧 Per-tool limits — set independent ceilings for individual tool names
🔀 Streaming support — works with both non-streaming and streaming responses
🤫 Silent suppression — over-limit calls are removed; no exception is thrown
📣 on_limit_exceeded callback — sync or async callback invoked when calls are blocked
📊 Usage introspection — get_current_usage() returns attempted vs allowed counts per tool
🔄 Resettable — reset() clears counters for a fresh session

📦 Installation

pip install azureaicommunity-agent-tool-limit

Or install directly from source:

cd AgentFramework/Python/Middleware/ToolLimitMiddleware
pip install -e .

🚀 Quick Start

import asyncio
from agent_framework import tool
from agent_framework.ollama import OllamaChatClient
from tool_limit_middleware import ToolLimitMiddleware, ToolLimits


@tool
def get_weather(location: str) -> str:
    return f"The weather in {location} is sunny with a high of 22°C."


async def main():
    client = OllamaChatClient(model="llama3.2")

    middleware = ToolLimitMiddleware(
        limits=ToolLimits(global_max=5),
    )

    agent = client.as_agent(
        name="WeatherAgent",
        instructions="You are a helpful assistant with a weather tool.",
        tools=[get_weather],
        middleware=[middleware],
    )

    response = await agent.run("What is the weather in Amsterdam?")
    print(response.text)


asyncio.run(main())

🔧 Per-Tool Limits

In addition to the global cap, restrict individual tools independently:

middleware = ToolLimitMiddleware(
    limits=ToolLimits(
        global_max=10,
        per_tool_max={
            "get_weather":  3,
            "search_videos": 2,
        },
    ),
    on_limit_exceeded=lambda info: print("Limit hit:", info),
)

Any call to get_weather beyond 3, or to search_videos beyond 2, is silently removed — even if the global limit has not been reached.
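The combined rule reduces to two counter checks per attempted call: one against the global cap, one against the tool's own ceiling (if any). A hedged sketch of that decision logic, using counter names of my own choosing rather than the package's internals:

```python
from collections import Counter


def is_allowed(tool_name, counts, global_max, per_tool_max):
    """Return True if one more call to tool_name fits both caps."""
    if sum(counts.values()) >= global_max:
        return False  # global cap already reached
    limit = per_tool_max.get(tool_name)
    if limit is not None and counts[tool_name] >= limit:
        return False  # this tool's own ceiling reached
    return True


counts = Counter()
per_tool = {"get_weather": 3, "search_videos": 2}
allowed = []
for name in ["get_weather"] * 4 + ["search_videos"] * 3:
    if is_allowed(name, counts, global_max=10, per_tool_max=per_tool):
        counts[name] += 1
        allowed.append(name)
# get_weather is capped at 3 and search_videos at 2,
# even though the global limit of 10 is never reached.
```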


📊 Inspect Usage

usage = middleware.get_current_usage()

print(f"Total allowed calls: {usage.total_calls} / {usage.global_limit}")

for tool_name, attempted in usage.per_tool.items():
    allowed = usage.per_tool_allowed.get(tool_name, 0)
    limit   = usage.per_tool_limits.get(tool_name)
    limit_text = f" / {limit}" if limit is not None else ""
    print(f"  {tool_name}: attempted={attempted}  allowed={allowed}{limit_text}")

# Reset for a new session
middleware.reset()

For streaming responses, a stream_transform_hook intercepts each ChatResponseUpdate as it arrives and removes over-limit function calls in real time. The on_limit_exceeded callback is fired once via a stream_result_hook after the stream completes.
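The streaming path can be pictured as an async generator that inspects each update in flight, passing text through and dropping over-limit function calls. This is an illustrative sketch only — the real hook names come from agent-framework, and the dict-shaped updates here are an assumption for demonstration:

```python
import asyncio


async def limited_stream(updates, global_max):
    # Pass text updates through; drop function-call updates once the
    # cap is hit, without raising an exception.
    seen = 0
    async for update in updates:
        if update.get("type") == "function_call":
            if seen >= global_max:
                continue  # suppress silently
            seen += 1
        yield update


async def demo():
    async def source():
        for u in [{"type": "function_call", "name": "a"},
                  {"type": "text", "text": "hi"},
                  {"type": "function_call", "name": "b"}]:
            yield u
    return [u async for u in limited_stream(source(), global_max=1)]


result = asyncio.run(demo())
# The second function call is dropped; the text update passes through.
```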


Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

azureaicommunity_agent_tool_limit-0.1.0.tar.gz (9.2 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

File details

Details for the file azureaicommunity_agent_tool_limit-0.1.0.tar.gz.

File metadata

File hashes

Hashes for azureaicommunity_agent_tool_limit-0.1.0.tar.gz
Algorithm Hash digest
SHA256 3288bfb79bd0b031dcd63b38088a52b0651ea9048c3f99028715ec799851ac7f
MD5 13ce51b40a7f53361bbd1a93f84c0192
BLAKE2b-256 688ca4fda2d17e07d6c4c9f6165fc7a5e5e2a3dcf00c8cd8a603ea465b723b0e

See more details on using hashes here.

File details

Details for the file azureaicommunity_agent_tool_limit-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for azureaicommunity_agent_tool_limit-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 9b9d2279f96acb0605a85a490b2b88d71d5446c4a4a19bced1c2edd35d93c875
MD5 d8d3e41ba5f78210e34f263cf9d83f6c
BLAKE2b-256 9a71aee0dfc59a883ecfbf1663cf8aa899ca6481c4b96348f41665e2f59e8daa

See more details on using hashes here.
