A library for managing nodes and agent workflows
HLR (Hierarchical LLM Routing)
HLR is a flexible and easy-to-use library for managing hierarchical workflows based on nodes.
Each node represents a step in your workflow, executes a function, and decides which node to execute next.
This makes it ideal for applications requiring sequential task processing with dynamic routing and context sharing.
Features
- Hierarchical Flow: Organize your workflow into nodes that can branch out or converge based on custom logic.
- Dynamic Routing: Each node can either explicitly determine the next node by returning an ID, or let an LLM decide the next step based on node descriptions.
- Shared Context: Data (e.g., logs or variable values) is maintained in a common context object that nodes can read or update.
- LLM Integration: Easily integrate with language models like Google’s Gemini (gemini-2.0-flash) or OpenAI’s gpt-4o for decision making.
- Input Validation: The library ensures that all required fields are provided and unique node IDs are enforced.
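As an illustration of the validation behavior, the uniqueness check can be sketched in a few lines. This is a hypothetical stand-in, not the library's actual code; `validate_nodes` is an invented helper name:

```python
def validate_nodes(node_ids):
    """Raise ValueError on a duplicate node ID (mirrors the check described above)."""
    seen = set()
    for node_id in node_ids:
        if node_id in seen:
            raise ValueError(f"Duplicate node ID: {node_id!r}")
        seen.add(node_id)
    return True

validate_nodes(["Input", "Database", "Output"])   # passes silently

try:
    validate_nodes(["Input", "Input"])
except ValueError as exc:
    print(exc)   # Duplicate node ID: 'Input'
```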
Installation
Install HLR from PyPI:
```shell
pip install hlr_agent
```
Note: Package names are normalized by PyPI. Although the package is defined as `hlr_agent`, you can install it using either `hlr_agent` or `hlr-agent`.
Usage
Below is an example of how to set up and run your hierarchical workflow.
Define Node Functions
Each node function takes the agent as a parameter (so it can update the context or decide the next node explicitly). For example:
```python
from hlr_agent import Node, Agent

def func_input(agent):
    print("AGENT IN INPUT NODE")
    agent.context["context"] = "- Input node executed.\n"

def func_database(agent):
    print("AGENT IN DATABASE NODE")
    agent.context["context"] += "- Database node executed.\n"

def func_files(agent):
    print("AGENT IN FILES NODE")
    agent.context["context"] += "- Files node executed.\n"

def func_mailing(agent):
    print("AGENT IN MAILING NODE")
    agent.context["context"] += "- Mailing node executed.\n"

def func_output(agent):
    print("AGENT IN OUTPUT NODE")
    print("LOGS:\n" + agent.context["context"])
```
Configure Nodes
Nodes are created by providing a unique node ID, a list of candidate child node IDs, an optional function to execute, and an optional description. For example:
```python
nodes = [
    Node("Input", children=["Database", "Files", "Mailing", "None"], func=func_input),
    Node("Database", children=["Output"], func=func_database, description="Select this if the user wants to use a database"),
    Node("Files", children=["Output"], func=func_files, description="Select this if the user wants to use a file"),
    Node("Mailing", children=["Output"], func=func_mailing, description="Select this if the user wants to use a mailing-related functionality"),
    Node("None", children=["Output"], func=None, description="Select this node if the rest of the nodes are not valid for the request"),
    Node("Output", children=None, func=func_output),
]
```
Initialize and Run the Agent
The Agent class handles the execution flow. It validates all required parameters and ensures that node IDs are unique.
It also integrates with an LLM for dynamic routing when a node does not explicitly return the next node's ID.
```python
# Example user message that the LLM can consider when deciding the next node.
user_message = "I want to send a message to my boss"

agent = Agent(
    nodes=nodes,
    start_node_id="Input",
    end_node_id="Output",
    model="gemini-2.0-flash",
    api_key="YOUR_API_KEY_HERE",  # Replace with your API key
    user_message=user_message,
)
agent.run()
```
How It Works
- Initialization: Upon creating an `Agent`, mandatory fields are validated and an internal map of nodes is built. Duplicate node IDs or missing parameters raise exceptions immediately.
- Execution Flow: The agent starts at the given `start_node_id` and executes the corresponding node's function. The node function may return the next node's ID explicitly, or the agent will consult the LLM using the node descriptions and an optional shared context.
- Dynamic Routing: If more than one child node is available and their descriptions are valid, the LLM (via `get_next_node`) will determine the most relevant child node, based on the user message and any extra context provided.
- Termination: The workflow finishes when a node with no children is reached, or the agent is directed to the `end_node_id`, which may itself execute a termination function.
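The flow above can be pictured with a minimal standalone loop. This is an illustrative sketch only, not the library's implementation: `SimpleNode`, `run_flow`, and `pick_child` are hypothetical names, and `pick_child` uses naive word matching where the real agent would call the LLM.

```python
class SimpleNode:
    def __init__(self, node_id, children=None, func=None, description=None):
        self.id = node_id
        self.children = children or []
        self.func = func
        self.description = description

def pick_child(children, user_message):
    # Stand-in for the LLM call: pick the first child whose description
    # shares a word with the user message, else fall back to the first child.
    for child in children:
        desc = (child.description or "").lower()
        if any(word in desc for word in user_message.lower().split()):
            return child.id
    return children[0].id

def run_flow(nodes, start_id, end_id, user_message):
    node_map = {n.id: n for n in nodes}
    context = {"context": ""}
    current = node_map[start_id]
    visited = []
    while True:
        visited.append(current.id)
        # A node function may return the next node's ID explicitly, or None.
        next_id = current.func(context) if current.func else None
        if current.id == end_id or not current.children:
            break
        if next_id is None:  # no explicit choice: delegate the routing decision
            children = [node_map[c] for c in current.children]
            next_id = pick_child(children, user_message)
        current = node_map[next_id]
    return visited
```

For example, with an `Input` node branching to `Mail` and `Files`, the message "please send a mail" steers the sketch through `Mail` before reaching `Output`.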
Contributing
Contributions are welcome! Feel free to open issues or submit pull requests for improvements and bug fixes.
License
This project is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.
Contact
For questions or issues, please contact davidsd.2704@gmail.com.
Download files
File details
Details for the file hlr_agent-0.2.2.tar.gz.
File metadata
- Download URL: hlr_agent-0.2.2.tar.gz
- Upload date:
- Size: 6.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 190d3d6410604c8f2070d2681cf3aa468434c1cd24f4a8202a81b4d630b7f099 |
| MD5 | 1cdbe12bdc6e654b5b30b70320d0e9a2 |
| BLAKE2b-256 | 8e0c6a3235bf7e12fc458c25863031167a62bf65541286b9f97d07273adc4bea |
File details
Details for the file hlr_agent-0.2.2-py3-none-any.whl.
File metadata
- Download URL: hlr_agent-0.2.2-py3-none-any.whl
- Upload date:
- Size: 7.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 40d3641f7613012f871fe36bda4215f24ea3b183e776541d4056de8a6f3b3946 |
| MD5 | 74cf0c59a8f66a55af17d2044050203f |
| BLAKE2b-256 | 55d6536ea038c890b6daabcf6f5c8e51458eaff6cc0d802181492c453be927cb |