
Tool calling made easy


ToolKitR

A lightweight Python library for creating and managing function tools that integrate with any LLM provider supporting function calling. ToolKitR provides type-safe function registration with automatic JSON Schema generation from Python type hints.

Features

  • Type-safe function tool registry system
  • Automatic JSON Schema generation from Python type annotations
  • Support for both synchronous and asynchronous tools
  • Custom response serialization for both success and error responses
  • Configurable exception handling
  • Support for complex Python types:
    • Enums, Dataclasses, TypedDicts, NamedTuples
    • Lists, Tuples, and Dictionaries
    • Optional and Union types
    • Literal types
    • Annotated types with descriptions

Installation

pip install toolkitr

Quick Start

from typing import Annotated
from toolkitr import ToolRegistry

# Create a registry
registry = ToolRegistry()

# Register a function as a tool
@registry.tool()
def get_weather(location: Annotated[str, "The location to get weather for"]) -> str:
    """Get the weather for a location."""
    return f"The weather in {location} is sunny."

# Get tool definitions for an LLM provider
tool_definitions = registry.definitions()

# Execute a tool directly
result = registry.call("get_weather", location="London")
print(result)  # "The weather in London is sunny."

# Execute a tool call from an LLM
tool_result = registry.tool_call({
    "id": "call_123",
    "type": "function",
    "function": {
        "name": "get_weather",
        "arguments": '{"location": "London"}'
    }
})

# Access components of the tool result
print(tool_result.result)       # The raw function result
print(tool_result.message)      # The formatted message for the LLM
print(tool_result.success)      # True if call succeeded, False if it raised an exception
print(tool_result.tool.name)    # The name of the tool that was called
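
For reference, the OpenAI-style definition that `registry.definitions()` generates for `get_weather` presumably resembles the hand-written sketch below; the exact keys and nesting are an assumption, not captured library output:

```python
# Hand-written sketch of an OpenAI-style tool definition like the one
# registry.definitions() should emit for get_weather; the exact fields
# below are assumed, not captured from the library.
expected_definition = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The location to get weather for",
                },
            },
            "required": ["location"],
        },
    },
}
print(expected_definition["function"]["name"])  # get_weather
```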

Working with Async Tools

import asyncio

# Define an async tool
async def async_weather(location: str) -> str:
    await asyncio.sleep(0.1)  # Simulate API call
    return f"Weather in {location} is cloudy."

# Register and use it
registry.register_tool(async_weather)

# Use the unified interface for both sync and async tools
async def main():
    # Works with both sync and async functions
    sync_result = await registry.smart_call("get_weather", location="Paris")
    async_result = await registry.smart_call("async_weather", location="Tokyo")
    
    # Handle OpenAI-style tool calls
    tool_result = await registry.smart_tool_call({
        "id": "call_456",
        "type": "function",
        "function": {
            "name": "async_weather",
            "arguments": '{"location": "Berlin"}'
        }
    })
    
    print(tool_result.result)  # "Weather in Berlin is cloudy."

asyncio.run(main())

Error Handling

import json

# Configure error handling
registry = ToolRegistry(
    # Custom exception serializer
    exception_serializer=lambda exc: json.dumps({
        "error": {
            "type": type(exc).__name__,
            "message": str(exc)
        }
    }),
    # Set to False to let exceptions propagate
    catch_exceptions=True
)

@registry.tool()
def risky_function(input: str) -> str:
    if input == "fail":
        raise ValueError("Intentional failure")
    return f"Success: {input}"

# When a tool raises an exception, tool_result has:
# - tool_result.error: The exception that was raised
# - tool_result.result: None
# - tool_result.success: False
# - tool_result.message: Contains the serialized error
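
As a standalone illustration (no registry involved), the exception serializer configured above turns a raised exception into a JSON payload like this:

```python
import json

# The same exception serializer configured on the registry above,
# applied directly to an exception for illustration.
def serialize_exception(exc: Exception) -> str:
    return json.dumps({
        "error": {
            "type": type(exc).__name__,
            "message": str(exc),
        }
    })

try:
    raise ValueError("Intentional failure")
except ValueError as exc:
    payload = serialize_exception(exc)

print(payload)  # {"error": {"type": "ValueError", "message": "Intentional failure"}}
```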

OpenAI Integration

from openai import OpenAI

client = OpenAI()
messages = [
    {"role": "user", "content": "What's the weather in London?"}
]

# Create chat completion with tools
response = client.chat.completions.create(
    messages=messages,
    model="gpt-4",
    tools=registry.definitions()
)

# Handle tool calls
message = response.choices[0].message
messages.append(message)

for tool_call in message.tool_calls:
    # Get the tool result
    tool_result = registry.tool_call(tool_call.model_dump())
    # Add just the message to the conversation
    messages.append(tool_result.message)
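
After the tool messages are appended, a second completion typically yields the model's final answer. This sketch continues the example above and assumes the same `client`, `messages`, and `registry`:

```python
# Send the tool results back so the model can produce a final answer.
final_response = client.chat.completions.create(
    messages=messages,
    model="gpt-4",
    tools=registry.definitions()
)
print(final_response.choices[0].message.content)
```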

Advanced Features

Custom Serializers

import json

# Registry-level serializer
registry = ToolRegistry(response_serializer=lambda result: json.dumps(result, indent=2))

# Per-tool serializer
@registry.tool(response_serializer=lambda x: f'"Custom: {x}"')
def special_tool(input: str) -> str:
    return input.upper()

Human-friendly Tool Titles

@registry.tool(title="Get Weather Information")
def get_weather(location: str) -> str:
    """Get the weather for a location."""
    return f"The weather in {location} is sunny."

Strict Mode

# Enable strict mode to prevent additional parameters
registry = ToolRegistry(strict=True)

# Override for specific tools
@registry.tool(strict=False)
def flexible_tool(param: str) -> str:
    """This tool allows additional parameters."""
    return f"Got {param}"

Complex Types

from enum import Enum
from dataclasses import dataclass
from typing import Optional, Literal, TypedDict, NamedTuple

class Priority(Enum):
    LOW = "low"
    HIGH = "high"

@dataclass
class User:
    name: str
    age: int

class Options(TypedDict):
    tags: list[str]
    due_date: Optional[str]

class Point(NamedTuple):
    x: float
    y: float

@registry.tool()
def create_task(
    user: User,
    priority: Priority,
    location: Point,
    options: Options,
    status: Literal["pending", "done"] = "pending"
) -> str:
    """Create a new task."""
    return f"Created task for {user.name} with {priority.value} priority"
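
An LLM-generated `arguments` payload for `create_task` would presumably look like the JSON below: objects for the dataclass and TypedDict, the enum's value for `priority`, and an array for the NamedTuple. The exact wire format is an assumption about how these types map to JSON:

```python
import json

# Hypothetical arguments string an LLM might send for create_task;
# the mapping of Python types to JSON shapes here is assumed.
arguments = json.dumps({
    "user": {"name": "Ada", "age": 36},                 # dataclass -> object
    "priority": "high",                                 # Enum -> its value
    "location": [1.5, 2.5],                             # NamedTuple -> array
    "options": {"tags": ["urgent"], "due_date": None},  # TypedDict -> object
    "status": "done",                                   # Literal -> one of its values
})

parsed = json.loads(arguments)
print(parsed["priority"])  # high
```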

Limitations

Plain tuples do not map cleanly onto JSON Schema, so when a parameter needs a sequence type, prefer:

  • NamedTuple for fixed-length sequences with named fields
  • list for variable-length sequences
  • dataclass for structured data
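
For example, instead of a bare `tuple[float, float]` parameter, a NamedTuple gives each position a name and a schema-friendly shape (a small sketch with a hypothetical `Coordinate` type):

```python
from typing import NamedTuple

# Prefer a NamedTuple over tuple[float, float]: each field gets a name
# and its own schema entry instead of an opaque positional pair.
class Coordinate(NamedTuple):
    lat: float
    lon: float

point = Coordinate(lat=51.5, lon=-0.12)
print(point.lat)  # 51.5
```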

License

MIT License. See LICENSE file for details.

