
Acton Agent

⚠️ Experimental Project: This is a personal project currently in an experimental phase. The API may change without notice, and features may be incomplete or unstable. Use at your own discretion.

Acton Agent is a lightweight, flexible LLM Agent Framework with tool execution capabilities. It enables you to build AI agents that can interact with external APIs, execute custom Python functions, and maintain conversation context - all with minimal configuration.

Installation

Basic Installation

pip install acton-agent

Installation with Optional Dependencies

For OpenAI integration:

pip install acton-agent[openai]

For development (includes testing and linting tools):

pip install acton-agent[dev]

Install all optional dependencies:

pip install acton-agent[all]

Requirements

  • Python >= 3.8
  • Core dependencies:
    • pydantic >= 2.0.0
    • tenacity >= 8.0.0
    • loguru >= 0.7.0
    • requests >= 2.31.0

Usage Examples

Example 1: Requests Tool Usage

The RequestsTool allows your agent to make HTTP API calls. Here's an example using the JSONPlaceholder API:

from acton_agent import Agent
from acton_agent.client import OpenAIClient
from acton_agent.tools import RequestsTool

# Initialize the OpenAI client
client = OpenAIClient(
    api_key="your-openai-api-key",
    model="gpt-4o"
)

# Create an agent
agent = Agent(
    llm_client=client,
    system_prompt="You are a helpful assistant that can fetch data from APIs."
)

# Create a RequestsTool for fetching posts from JSONPlaceholder
posts_tool = RequestsTool(
    name="get_posts",
    description="Fetch posts from JSONPlaceholder API",
    method="GET",
    url_template="https://jsonplaceholder.typicode.com/posts",
    query_params_schema={
        "userId": {
            "type": "number",
            "description": "Filter posts by user ID",
            "required": False
        }
    }
)

# Register the tool with the agent
agent.register_tool(posts_tool)

# Run the agent with a query
result = agent.run("Get me the posts from user ID 1")
print(result)

You can also use the convenient create_api_tool helper:

from acton_agent.tools import create_api_tool

# Create a tool for fetching a specific post
post_tool = create_api_tool(
    name="get_post",
    description="Fetch a specific post by ID",
    endpoint="https://jsonplaceholder.typicode.com/posts/{post_id}",
    method="GET"
)

# Note: Path parameters are automatically extracted from the URL template
agent.register_tool(post_tool)
result = agent.run("Get me post number 5")
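The note above says path parameters are extracted from the URL template automatically. A minimal sketch of how such extraction could work — plain Python, not the framework's actual implementation:

```python
import re

def extract_path_params(url_template):
    """Find {placeholder} names in a URL template."""
    return re.findall(r"\{(\w+)\}", url_template)

params = extract_path_params("https://jsonplaceholder.typicode.com/posts/{post_id}")
print(params)  # ['post_id']
```

Each extracted name becomes a parameter the agent must supply when it calls the tool.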

Example 2: Function Tool Agent

The FunctionTool allows you to wrap Python functions and expose them to your agent:

from acton_agent import Agent
from acton_agent.client import OpenAIClient
from acton_agent.agent import FunctionTool

# Initialize the client
client = OpenAIClient(
    api_key="your-openai-api-key",
    model="gpt-4o"
)

# Create an agent
agent = Agent(
    llm_client=client,
    system_prompt="You are a helpful assistant with calculator capabilities."
)

# Define a Python function
def calculate(a: float, b: float, operation: str) -> float:
    """Perform basic arithmetic operations."""
    if operation == "add":
        return a + b
    elif operation == "subtract":
        return a - b
    elif operation == "multiply":
        return a * b
    elif operation == "divide":
        if b == 0:
            raise ValueError("Cannot divide by zero")
        return a / b
    else:
        raise ValueError(f"Unknown operation: {operation}")

# Define the schema for the function
calculator_schema = {
    "type": "object",
    "properties": {
        "a": {
            "type": "number",
            "description": "First number"
        },
        "b": {
            "type": "number",
            "description": "Second number"
        },
        "operation": {
            "type": "string",
            "description": "Operation to perform",
            "enum": ["add", "subtract", "multiply", "divide"]
        }
    },
    "required": ["a", "b", "operation"]
}

# Create a FunctionTool
calculator_tool = FunctionTool(
    name="calculator",
    description="Perform basic arithmetic operations",
    func=calculate,
    schema=calculator_schema
)

# Register the tool with the agent
agent.register_tool(calculator_tool)

# Run the agent with queries
result = agent.run("What is 25 multiplied by 4?")
print(result)

result = agent.run("Calculate 100 divided by 5, then add 10 to the result")
print(result)
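To see what the schema buys you, here is a hand-rolled sketch of validating arguments against a JSON-schema-like dict before dispatching to the wrapped function. This is illustrative only, not the framework's actual validation code:

```python
def validate_args(args, schema):
    """Reject calls that are missing required keys or violate an enum."""
    for key in schema.get("required", []):
        if key not in args:
            raise ValueError(f"missing required argument: {key}")
    for key, spec in schema.get("properties", {}).items():
        if key in args and "enum" in spec and args[key] not in spec["enum"]:
            raise ValueError(f"invalid value for {key}: {args[key]!r}")

schema = {
    "type": "object",
    "properties": {
        "operation": {"type": "string", "enum": ["add", "subtract"]},
    },
    "required": ["operation"],
}

validate_args({"operation": "add"}, schema)  # passes silently
# validate_args({}, schema) would raise ValueError: missing required argument
```

Catching bad arguments before the function runs gives the model a clear error message to retry against, instead of an arbitrary traceback.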

You can also create custom tools by subclassing the Tool class:

from acton_agent.agent import Tool

class WeatherTool(Tool):
    """Custom tool for getting weather information."""

    def __init__(self):
        super().__init__(
            name="get_weather",
            description="Get current weather for a city"
        )

    def execute(self, parameters: dict) -> str:
        """Execute the tool with the given parameters."""
        city = parameters.get("city", "Unknown")
        # In a real implementation, you would call a weather API here
        return f"The weather in {city} is sunny and 72°F"

    def get_schema(self) -> dict:
        """Return the JSON schema for the tool parameters."""
        return {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "Name of the city"
                }
            },
            "required": ["city"]
        }

# Use the custom tool
weather_tool = WeatherTool()
agent.register_tool(weather_tool)
result = agent.run("What's the weather in San Francisco?")

Example 3: Streaming Responses

You can stream responses from the agent in real-time:

from acton_agent import Agent
from acton_agent.client import OpenAIClient
from acton_agent.tools import RequestsTool

# Initialize the client
client = OpenAIClient(
    api_key="your-openai-api-key",
    model="gpt-4o"
)

# Create an agent with streaming enabled
agent = Agent(
    llm_client=client,
    system_prompt="You are a helpful assistant.",
    stream=True
)

# Add a tool (optional)
posts_tool = RequestsTool(
    name="get_posts",
    description="Fetch posts from JSONPlaceholder API",
    method="GET",
    url_template="https://jsonplaceholder.typicode.com/posts"
)
agent.register_tool(posts_tool)

# Stream the response
for event in agent.run_stream("Tell me about post number 1"):
    if event.get("type") == "content":
        print(event.get("data"), end="", flush=True)
    elif event.get("type") == "tool_call":
        print(f"\n[Calling tool: {event.get('tool_name')}]\n")
    elif event.get("type") == "tool_result":
        print("\n[Tool result received]\n")
print()  # Final newline
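The event handling above can be exercised without an API key by faking the stream. A toy generator, assuming event dicts use the same type/data keys as in the loop above:

```python
def fake_stream():
    """Yield events shaped like those consumed in the streaming loop (assumed format)."""
    yield {"type": "tool_call", "tool_name": "get_posts"}
    yield {"type": "tool_result"}
    yield {"type": "content", "data": "Post 1 is about "}
    yield {"type": "content", "data": "a classic placeholder topic."}

chunks = []
for event in fake_stream():
    if event.get("type") == "content":
        chunks.append(event.get("data"))

print("".join(chunks))  # Post 1 is about a classic placeholder topic.
```

Faking the stream like this is handy for testing your rendering code (terminal output, web sockets, etc.) independently of the LLM.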

More Examples

For complete, runnable examples, check out the examples directory in the GitHub repository.

API Documentation

For detailed API documentation, please refer to the docstrings in the source code or visit our GitHub repository.

Additional Information

Current Status

This project is in an experimental phase and is primarily for personal use. Keep the following in mind:

  • API Stability: The API may change between versions without notice
  • Production Readiness: Not recommended for production use yet
  • Documentation: Documentation is being actively developed
  • Testing: Test coverage is being expanded

Known Limitations

  • Limited to text-based interactions (no multimodal support yet)
  • Tool execution is synchronous (no async support yet)
  • Limited error recovery strategies for complex tool chains
  • No built-in conversation persistence
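Until built-in persistence lands, conversation state can be kept outside the framework — for example, by recording each turn yourself and replaying it into future prompts. A minimal JSON round-trip sketch that uses no acton-agent APIs:

```python
import json

def save_history(history):
    """Serialize the turn list; write the string wherever you persist state."""
    return json.dumps(history, indent=2)

def load_history(blob):
    """Restore a turn list from a previously saved string."""
    return json.loads(blob) if blob else []

history = load_history("")  # fresh session
history.append({"role": "user", "content": "What is 25 multiplied by 4?"})
history.append({"role": "assistant", "content": "100"})

blob = save_history(history)
restored = load_history(blob)  # identical to history
```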

Planned Features

  • Asynchronous tool execution
  • Multimodal support (images, audio)
  • Built-in conversation persistence and memory
  • More pre-built tools for common tasks
  • Better error handling and recovery
  • Support for more LLM providers
  • Tool composition and chaining utilities
  • Improved streaming capabilities
  • Plugin system for extensions

Contributing

As this is a personal experimental project, contributions are not actively sought at this time. However, if you find bugs or have suggestions, feel free to open an issue on GitHub.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Support

For questions or issues, please use the GitHub Issues page.


Disclaimer: This is an experimental personal project. Use it at your own risk. The author makes no guarantees about stability, security, or fitness for any particular purpose.

Project details


Download files

Download the file for your platform.

Source Distribution

acton_agent-0.0.7.tar.gz (69.0 kB)


Built Distribution


acton_agent-0.0.7-py3-none-any.whl (45.7 kB)


File details

Details for the file acton_agent-0.0.7.tar.gz.

File metadata

  • Download URL: acton_agent-0.0.7.tar.gz
  • Upload date:
  • Size: 69.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for acton_agent-0.0.7.tar.gz:

  • SHA256: 9295397be81755800160e54d4a9381827e4c912c7473c5021a61bddbf4797339
  • MD5: c21493df1377426f6304845c4a8d901c
  • BLAKE2b-256: 4e17302e4bc8e3d9471a60f4050c2f5f671a038ba7709e1fc9d5ae34d7a63568


Provenance

The following attestation bundles were made for acton_agent-0.0.7.tar.gz:

Publisher: python-publish.yml on akstspace/acton-agent

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file acton_agent-0.0.7-py3-none-any.whl.

File metadata

  • Download URL: acton_agent-0.0.7-py3-none-any.whl
  • Upload date:
  • Size: 45.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for acton_agent-0.0.7-py3-none-any.whl:

  • SHA256: 9bd8102945a249bc74e7a13d3ee77d64267353ef39bff73a731959a3f879f47d
  • MD5: 4b5adab7a636ebf3934894102dc2ce0e
  • BLAKE2b-256: b5f77991bf8d539668e0afd2c0718b99f96af246269c42efbd8809cf763aaf0b


Provenance

The following attestation bundles were made for acton_agent-0.0.7-py3-none-any.whl:

Publisher: python-publish.yml on akstspace/acton-agent

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
