# Acton Agent

> ⚠️ **Experimental Project**: This is a personal project currently in an experimental phase. The API may change without notice, and features may be incomplete or unstable. Use at your own discretion.
Acton Agent is a lightweight, flexible LLM Agent Framework with tool execution capabilities. It enables you to build AI agents that can interact with external APIs, execute custom Python functions, and maintain conversation context - all with minimal configuration.
## Installation

### Basic Installation

```bash
pip install acton-agent
```

### Installation with Optional Dependencies

For OpenAI integration:

```bash
pip install acton-agent[openai]
```

For development (includes testing and linting tools):

```bash
pip install acton-agent[dev]
```

To install all optional dependencies:

```bash
pip install acton-agent[all]
```
### Requirements

- Python >= 3.8
- Core dependencies:
  - pydantic >= 2.0.0
  - tenacity >= 8.0.0
  - loguru >= 0.7.0
  - requests >= 2.31.0
## Usage Examples

### Example 1: Requests Tool Usage

The `RequestsTool` allows your agent to make HTTP API calls. Here's an example using the JSONPlaceholder API:
```python
from acton_agent import Agent
from acton_agent.client import OpenAIClient
from acton_agent.tools import RequestsTool

# Initialize the OpenAI client
client = OpenAIClient(
    api_key="your-openai-api-key",
    model="gpt-4o"
)

# Create an agent
agent = Agent(
    llm_client=client,
    system_prompt="You are a helpful assistant that can fetch data from APIs."
)

# Create a RequestsTool for fetching posts from JSONPlaceholder
posts_tool = RequestsTool(
    name="get_posts",
    description="Fetch posts from JSONPlaceholder API",
    method="GET",
    url_template="https://jsonplaceholder.typicode.com/posts",
    query_params_schema={
        "userId": {
            "type": "number",
            "description": "Filter posts by user ID",
            "required": False
        }
    }
)

# Register the tool with the agent
agent.register_tool(posts_tool)

# Run the agent with a query
result = agent.run("Get me the posts from user ID 1")
print(result)
```
You can also use the convenient `create_api_tool` helper:

```python
from acton_agent.tools import create_api_tool

# Create a tool for fetching a specific post
# Note: path parameters are automatically extracted from the URL template
post_tool = create_api_tool(
    name="get_post",
    description="Fetch a specific post by ID",
    endpoint="https://jsonplaceholder.typicode.com/posts/{post_id}",
    method="GET"
)

agent.register_tool(post_tool)
result = agent.run("Get me post number 5")
```
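To illustrate what "path parameters are extracted from the URL template" means, here is a minimal sketch of recognizing `{name}` placeholders in a template. This is an illustrative example only, not the library's actual implementation; `extract_path_params` is a hypothetical helper:

```python
import re

def extract_path_params(url_template):
    # Find every {name} placeholder in the template
    return re.findall(r"\{(\w+)\}", url_template)

params = extract_path_params("https://jsonplaceholder.typicode.com/posts/{post_id}")
print(params)  # → ['post_id']
```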
### Example 2: Function Tool Agent

The `FunctionTool` allows you to wrap Python functions and expose them to your agent:
```python
from acton_agent import Agent
from acton_agent.client import OpenAIClient
from acton_agent.agent import FunctionTool

# Initialize the client
client = OpenAIClient(
    api_key="your-openai-api-key",
    model="gpt-4o"
)

# Create an agent
agent = Agent(
    llm_client=client,
    system_prompt="You are a helpful assistant with calculator capabilities."
)

# Define a Python function
def calculate(a: float, b: float, operation: str) -> float:
    """Perform basic arithmetic operations."""
    if operation == "add":
        return a + b
    elif operation == "subtract":
        return a - b
    elif operation == "multiply":
        return a * b
    elif operation == "divide":
        if b == 0:
            raise ValueError("Cannot divide by zero")
        return a / b
    else:
        raise ValueError(f"Unknown operation: {operation}")

# Define the JSON schema for the function's parameters
calculator_schema = {
    "type": "object",
    "properties": {
        "a": {
            "type": "number",
            "description": "First number"
        },
        "b": {
            "type": "number",
            "description": "Second number"
        },
        "operation": {
            "type": "string",
            "description": "Operation to perform",
            "enum": ["add", "subtract", "multiply", "divide"]
        }
    },
    "required": ["a", "b", "operation"]
}

# Create a FunctionTool
calculator_tool = FunctionTool(
    name="calculator",
    description="Perform basic arithmetic operations",
    func=calculate,
    schema=calculator_schema
)

# Register the tool with the agent
agent.register_tool(calculator_tool)

# Run the agent with queries
result = agent.run("What is 25 multiplied by 4?")
print(result)

result = agent.run("Calculate 100 divided by 5, then add 10 to the result")
print(result)
```
You can also create custom tools by subclassing the `Tool` class:
```python
from acton_agent.agent import Tool

class WeatherTool(Tool):
    """Custom tool for getting weather information."""

    def __init__(self):
        super().__init__(
            name="get_weather",
            description="Get current weather for a city"
        )

    def execute(self, parameters: dict) -> str:
        """Execute the tool with the given parameters."""
        city = parameters.get("city", "Unknown")
        # In a real implementation, you would call a weather API here
        return f"The weather in {city} is sunny and 72°F"

    def get_schema(self) -> dict:
        """Return the JSON schema for the tool parameters."""
        return {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "Name of the city"
                }
            },
            "required": ["city"]
        }

# Use the custom tool
weather_tool = WeatherTool()
agent.register_tool(weather_tool)
result = agent.run("What's the weather in San Francisco?")
```
### Example 3: Streaming Responses

You can stream responses from the agent in real-time:
```python
from acton_agent import Agent
from acton_agent.client import OpenAIClient
from acton_agent.tools import RequestsTool

# Initialize the client
client = OpenAIClient(
    api_key="your-openai-api-key",
    model="gpt-4o"
)

# Create an agent with streaming enabled
agent = Agent(
    llm_client=client,
    system_prompt="You are a helpful assistant.",
    stream=True
)

# Add a tool (optional)
posts_tool = RequestsTool(
    name="get_posts",
    description="Fetch posts from JSONPlaceholder API",
    method="GET",
    url_template="https://jsonplaceholder.typicode.com/posts"
)
agent.register_tool(posts_tool)

# Stream the response, handling each event type as it arrives
for event in agent.run_stream("Tell me about post number 1"):
    if event.get("type") == "content":
        print(event.get("data"), end="", flush=True)
    elif event.get("type") == "tool_call":
        print(f"\n[Calling tool: {event.get('tool_name')}]\n")
    elif event.get("type") == "tool_result":
        print("\n[Tool result received]\n")

print()  # Final newline
```
## More Examples

For complete, runnable examples, check out the `examples` directory:

- `examples/requests_tool_example.py` - API integration with `RequestsTool`
- `examples/function_tool_example.py` - Custom Python function tools
- `examples/streaming_example.py` - Real-time streaming responses
- `examples/custom_tool_example.py` - Building custom tool classes
## API Documentation

For detailed API documentation, please refer to the docstrings in the source code or visit our GitHub repository.
## Additional Information

### Current Status

This project is in an experimental phase and is primarily for personal use. Keep the following in mind:

- **API Stability**: The API may change between versions without notice
- **Production Readiness**: Not recommended for production use yet
- **Documentation**: Documentation is being actively developed
- **Testing**: Test coverage is being expanded
### Known Limitations
- Limited to text-based interactions (no multimodal support yet)
- Tool execution is synchronous (no async support yet)
- Limited error recovery strategies for complex tool chains
- No built-in conversation persistence
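Until async support lands, a blocking `agent.run` call can still be used from async code by off-loading it to a worker thread. This is a minimal sketch assuming only the synchronous `agent.run(query)` interface shown above; `run_agent_async` is a hypothetical helper, not part of the library:

```python
import asyncio

async def run_agent_async(agent, query):
    # Off-load the blocking agent.run call to a worker thread so it
    # doesn't stall the event loop (asyncio.to_thread needs Python >= 3.9)
    return await asyncio.to_thread(agent.run, query)
```

This keeps an event loop responsive while the agent waits on the LLM, but note that tool execution itself remains synchronous inside the worker thread.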
### Planned Features

- Asynchronous tool execution
- Multimodal support (images, audio)
- Built-in conversation persistence and memory
- More pre-built tools for common tasks
- Better error handling and recovery
- Support for more LLM providers
- Tool composition and chaining utilities
- Improved streaming capabilities
- Plugin system for extensions
## Contributing

As this is a personal experimental project, contributions are not actively sought at this time. However, if you find bugs or have suggestions, feel free to open an issue on GitHub.

## License

This project is licensed under the MIT License. See the LICENSE file for details.

## Support

For questions or issues, please use the GitHub Issues page.

> **Disclaimer**: This is an experimental personal project. Use it at your own risk. The author makes no guarantees about stability, security, or fitness for any particular purpose.
## File details: acton_agent-0.0.9.tar.gz

- Size: 67.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | 957f52bd428185c3d38f60d522aabfa422e2efe44ae25aca7154f22569af9bb5 |
| MD5 | 39c2019db81f26d586494de0fc7423d5 |
| BLAKE2b-256 | d8d507ba664b72241df6c0ec8d8fdba4ff6e3dd29bdc354d888d403d1476514b |
### Provenance

The following attestation bundle was made for acton_agent-0.0.9.tar.gz:

- Publisher: python-publish.yml on akstspace/acton-agent
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: acton_agent-0.0.9.tar.gz
- Subject digest: 957f52bd428185c3d38f60d522aabfa422e2efe44ae25aca7154f22569af9bb5
- Sigstore transparency entry: 768520875
- Permalink: akstspace/acton-agent@38948d5e208e7be93560a6408f9bd00d725657e7
- Branch / Tag: refs/tags/v0.0.9
- Owner: https://github.com/akstspace
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@38948d5e208e7be93560a6408f9bd00d725657e7
- Trigger Event: release
## File details: acton_agent-0.0.9-py3-none-any.whl

- Size: 44.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | 2004b42c63a0cdb46c5dd8bdb69757638ac561cba21ca9e4cbaa2c6b4970b739 |
| MD5 | 453c47bf8730b839c6a67638e53c1c0b |
| BLAKE2b-256 | 6b7cf2bba6c0fab9433c249593cffda231d23ab9cf83e4ea030b401e47ba56a4 |
### Provenance

The following attestation bundle was made for acton_agent-0.0.9-py3-none-any.whl:

- Publisher: python-publish.yml on akstspace/acton-agent
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: acton_agent-0.0.9-py3-none-any.whl
- Subject digest: 2004b42c63a0cdb46c5dd8bdb69757638ac561cba21ca9e4cbaa2c6b4970b739
- Sigstore transparency entry: 768520878
- Permalink: akstspace/acton-agent@38948d5e208e7be93560a6408f9bd00d725657e7
- Branch / Tag: refs/tags/v0.0.9
- Owner: https://github.com/akstspace
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@38948d5e208e7be93560a6408f9bd00d725657e7
- Trigger Event: release