Agentron
Agentron is a modular Python toolkit for building AI agents. Its features include:
- Support for most major providers (OpenAI, Anthropic, Google, OpenRouter, ...), including certain subscription plans like ChatGPT Plus/Pro
- Defining custom tools for agents using plain Python functions and types
- Session persistence
- A web-based agent activity viewer (with streaming event support)
- Automatic model metadata discovery via services like models.dev
- A collection of built-in tools that you can mix and match
Installation
pip install agentron
Requirements
- Python 3.12 or newer
- Node.js 20.19 or newer
Example
import asyncio

from agentron import make_agent

# Tools are regular Python functions (may be async).
# Agentron parses and validates the type annotations and
# docstrings to generate LLM-compatible tool schemas.

def get_current_city() -> str:
    """
    Returns the name of the user's current city.
    """
    return 'San Francisco'

def get_calvinball_team_name(city: str) -> str:
    """
    Returns the name of the local Calvinball team in the specified city.

    Args:
        city: The name of the city to get the team name for.
    """
    return f'{city} Sprockets'

async def main():
    agent = make_agent(
        system_prompt="You are a helpful assistant. Use the available tools to answer the user's question.",
        tools=[
            get_current_city,
            get_calvinball_team_name,
        ],
        # The latest model details are auto-fetched.
        # The API key (if not explicitly passed in here) is automatically
        # resolved from environment vars or ~/.agentron/auth.json
        model='openai:gpt-5.4',
        # Display agent activity in the terminal
        terminal=True,
    )

    response = await agent.ask('What is the name of the local Calvinball team in my city?')
    print('Agent response:', response)

asyncio.run(main())
Models
Models can be accessed in multiple ways:
- As <provider>:<model name> strings (for example, openai:gpt-5.4) when using convenience functions like make_agent
- Via the get_model function
- By manually instantiating a Model instance, potentially via make_model
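A <provider>:<model name> string splits at the first colon, so model names containing further punctuation are preserved. The helper below is a hypothetical sketch of that convention, not part of Agentron's API:

```python
def parse_model_string(spec: str) -> tuple[str, str]:
    """Split a '<provider>:<model name>' spec at the first colon."""
    provider, sep, model = spec.partition(':')
    if not sep or not provider or not model:
        raise ValueError(f"Expected '<provider>:<model name>', got {spec!r}")
    return provider, model

print(parse_model_string('openai:gpt-5.4'))  # ('openai', 'gpt-5.4')
```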
By default, Agentron resolves a model's details, such as its endpoint and context window, using online model metadata services like models.dev.
Authentication
You can provide authentication details in several ways:
- Pass an API key explicitly using the api_key argument to make_agent (generally not recommended)
- Set a provider-specific environment variable (for example, ANTHROPIC_API_KEY)
- Add credentials to ~/.agentron/auth.json:

{
    "zai-coding-plan": "<api key goes here>",
    "openai": "<api key goes here>",
    "openai:gpt-5.4": "<model-scoped api key goes here>"
}
For subscription plans like ChatGPT Pro/Plus, Agentron needs to acquire an OAuth token. Use the built-in interactive login utility:
agentron login
Tools
Defining Tools
Agentron supports regular Python functions, callables, and types for defining tools:
def my_custom_tool(arg_1: str, arg_2: int = 42) -> str:
    """
    A description for my custom tool.

    Args:
        arg_1: This is a description of the first argument.
        arg_2: This is a description of the second argument.
            It may span multiple lines.
    """
    ...
As shown above, for a Python function to be used as a tool, it must:
- Specify type annotations for all arguments
- Use Google-style docstrings to provide a function description and descriptions for each argument under the Args section (validated at runtime by Agentron)
Agentron automatically parses the function above to generate an LLM-compatible tool schema like the following:
{
    "name": "my_custom_tool",
    "description": "A description for my custom tool.",
    "parameters": {
        "type": "object",
        "properties": {
            "arg_1": {
                "type": "string",
                "description": "This is a description of the first argument."
            },
            "arg_2": {
                "type": "integer",
                "description": "This is a description of the second argument. It may span multiple lines."
            }
        },
        "required": ["arg_1"]
    }
}
The following argument types are supported:
- Python primitive types (int, float, str, bool, None)
- Unions (for example, str | None)
- list (for example, list[str]) and dict (for example, dict[str, int])
- TypedDict
- dataclass
- Callable class instances
- Partial functions
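For the primitive case, the annotation-to-schema mapping can be sketched with the standard library alone. This is not Agentron's implementation (which also parses Args: docstring sections and handles unions, containers, TypedDicts, and dataclasses); it only shows the shape of the output:

```python
import inspect
from typing import get_type_hints

# Rough mapping from Python primitives to JSON-schema type names.
_JSON_TYPES = {str: 'string', int: 'integer', float: 'number', bool: 'boolean'}

def build_tool_schema(func) -> dict:
    """Generate a minimal tool schema for a primitive-typed function."""
    hints = get_type_hints(func)
    hints.pop('return', None)
    sig = inspect.signature(func)
    properties = {name: {'type': _JSON_TYPES[tp]} for name, tp in hints.items()}
    # Arguments without defaults are required.
    required = [
        name for name in hints
        if sig.parameters[name].default is inspect.Parameter.empty
    ]
    return {
        'name': func.__name__,
        'description': (inspect.getdoc(func) or '').split('\n\n')[0],
        'parameters': {
            'type': 'object',
            'properties': properties,
            'required': required,
        },
    }
```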
Built-in Tools
Agentron includes a collection of built-in tools under the agentron.kit submodule. These currently include:
- Filesystem I/O: read_file, write_file, apply_patch
- Shell calls: bash, git, grep
- A stateful Python REPL for agents
The built-in agentron code command provides a minimal coding agent implementation built using these tools.
Persistence
Specifying the output argument causes Agentron to persist session events (metadata, messages, ...) as JSONL files, written as events complete:
agent = make_agent(
    # If this path points to an existing directory, session events will
    # automatically be written to a file under it named <session_id>.jsonl.
    # Otherwise, the path is treated as the target JSONL file.
    output="/path/to/output",
    ...
)
For more details, see serialization.py.
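Because the output is JSONL (one JSON object per line), saved sessions can be inspected with a few lines of standard-library code. The event fields used in the test below are placeholders; the actual event schema is defined in serialization.py:

```python
import json
from pathlib import Path

def load_session_events(path: Path) -> list[dict]:
    """Read a JSONL session file: one JSON object per line, blanks skipped."""
    return [
        json.loads(line)
        for line in path.read_text().splitlines()
        if line.strip()
    ]
```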
Web UI
Agentron includes a local web-based UI for observing agent activity and viewing past sessions.
To monitor a session and launch the web server:
from agentron.web import serve

agent = make_agent(...)

# Launch the web server (opens a browser by default).
# Multiple agents may be specified.
with serve(agent):
    # Perform tasks with agent
    ...
To view previously saved sessions, use the web command:
agentron web <path to .jsonl or a directory containing one or more .jsonl files>
Agent Events
The Agent instance exposes a set of publishers that trigger whenever certain events occur:
- on_new_message: Published whenever a new message (user, assistant, ...) is added to the session
- on_streaming_message: Published as a new assistant response streams in
- on_tool_call: Published before executing a tool call (see also: tool manager)
These events are used by components like the Web UI and automatic persistence. You can observe them like this:
from agentron.types.message import AgentMessage

def handle_new_message(msg: AgentMessage) -> None:
    ...

# Start receiving new message events
unsubscribe = agent.on_new_message.subscribe(handle_new_message)

...

# Stop receiving new message events
unsubscribe()
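The subscribe-returns-unsubscribe shape shown above takes only a few lines to implement. This generic sketch mirrors that interface; it is not Agentron's internal publisher class:

```python
from typing import Callable

class Publisher:
    """Minimal publisher mirroring the subscribe() -> unsubscribe shape."""

    def __init__(self) -> None:
        self._subscribers: list[Callable] = []

    def subscribe(self, handler: Callable) -> Callable[[], None]:
        self._subscribers.append(handler)

        def unsubscribe() -> None:
            self._subscribers.remove(handler)

        return unsubscribe

    def publish(self, event) -> None:
        # Copy the list so handlers may unsubscribe during publication.
        for handler in list(self._subscribers):
            handler(event)
```

Returning a closure from subscribe() means the caller never needs to keep a reference to the handler just to remove it later.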
Technical Notes
Communication Backend
There are currently multiple APIs across providers. For example, OpenAI has both its legacy Chat Completions API and the newer Responses API, while Anthropic has its own Messages API. Other providers may claim compatibility with existing APIs such as OpenAI's, but still differ in subtle ways.
Several libraries attempt to abstract over these differences and expose a unified interface. Agentron uses pi-ai for this purpose.
Agentron lazily spawns a lightweight, process-wide Node.js RPC helper (flux) to communicate with LLMs via the pi-ai translation layer, which eventually delegates to provider-specific JavaScript SDKs. IPC occurs over Unix domain sockets.
This process is automatically torn down when the parent Python process exits.
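The flux wire protocol is internal to Agentron, but the transport style it describes (request/response over a Unix domain socket) can be illustrated generically. The socketpair and newline-delimited JSON framing below stand in for the Python-to-Node connection and are purely illustrative:

```python
import json
import socket

# A socketpair yields two connected AF_UNIX endpoints in-process,
# standing in for the Python <-> helper-process connection.
parent, child = socket.socketpair()

# "Client" side: send one newline-delimited JSON request.
parent.sendall(json.dumps({'method': 'ping'}).encode() + b'\n')

# "Helper" side: read the request and echo a JSON response.
msg = json.loads(child.makefile('rb').readline())
child.sendall(json.dumps({'result': msg['method'] + '-ok'}).encode() + b'\n')

# "Client" side: read the response back.
response = json.loads(parent.makefile('rb').readline())
print(response)  # {'result': 'ping-ok'}

parent.close()
child.close()
```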
License
Provided under the MIT License.