
Athon is the agentic-python library that provides the platform services of HPE Agentic Tool Mesh, an innovative platform designed to streamline and enhance the use of AI in various applications. It serves as a central hub that orchestrates 'Intelligent Plugins,' optimizing AI interactions and processes.

Project description

LLM Agentic Tool Mesh

Welcome to LLM Agentic Tool Mesh, a pioneering initiative by HPE Athonet aimed at democratizing Generative Artificial Intelligence (Gen AI). Our vision is to make Gen AI accessible and beneficial to a broader audience, enabling users from various backgrounds to leverage cutting-edge Gen AI technology effortlessly.

Understanding the Challenges

Gen AI has the potential to revolutionize businesses, but adopting it comes with challenges:

  • Technical Complexity: Gen AI tools are powerful but often require both coding and machine learning expertise. This makes it difficult for companies to use these tools effectively without specialized skills.
  • Organizational Challenges: Simply adding a Gen AI team isn’t enough. The real value comes from using the knowledge of your existing teams, especially those who may not be tech experts. However, if not done right, Gen AI can impact team dynamics. It’s important to find ways to use Gen AI that enhance collaboration and make the most of everyone’s expertise.

Our Approach

LLM Agentic Tool Mesh empowers users to create tools and web applications using Gen AI with Low or No Coding. This approach addresses the technical challenges by simplifying the integration process. By leveraging the Pareto principle, LLM Agentic Tool Mesh focuses on the 20% of features that cover 80% of user needs. This is achieved by abstracting complex, low-level libraries into easy-to-understand services that are accessible even to non-developers, effectively hiding the underlying complexity.

This simplicity not only helps technical teams but also enables non-technical teams to develop tools related to their domain expertise. The platform then allows for the creation of a "Mesh" of these Gen AI tools, providing orchestration capabilities through an agentic Reasoning Engine based on Large Language Models (LLMs). This orchestration ensures that all tools work together seamlessly, enhancing overall functionality and efficiency across the organization.

Quick Start

We have created a series of tools and examples to demonstrate what you can do with LLM Agentic Tool Mesh. To get started, follow these steps to set up your environment, understand the project structure, and run the tools and web applications provided.

Folder Structure

The project is organized into the following directories:

  • src: Source code
    • lib: Contains athon, the agentic-python library with all self-serve platform services for creating tools and web applications. These services are grouped into:
      • Chat Services
      • RAG (Retrieval-Augmented Generation) Services
      • Agent Services
      • System Platform Services
    • platform: Includes the agentic tool mesh with examples of Gen AI applications that demonstrate various capabilities:
      • Tool Examples: Demonstrates how to call an API, improve text, generate code, retrieve information from documents using RAG, and use a multi-agent system to solve complex tasks.
      • Web Applications:
        • A chatbot that orchestrates all these tools.
        • An agentic memory for sharing chat messages among different users.
        • A back panel that allows configuring a tool via a user interface.
    • notebooks: Contains interactive Jupyter notebooks to explore LLM Agentic Tool Mesh functionalities:
      • Platform Services: Notebooks to try Chat, RAG, and Agent services.
      • Meta-Prompting: Notebooks for creating a Customer Support Service agent using meta-prompting.
  • policies: Contains a set of governance policies and standards to ensure consistency, ethical adherence, and quality across all tools.
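To make the RAG services listed above more concrete, here is a minimal, library-agnostic sketch of the retrieval-augmented generation flow: retrieve the most relevant documents for a query, then build an augmented prompt. None of these names come from the athon API; they only illustrate the idea.

```python
# Library-agnostic sketch of retrieval-augmented generation (RAG).
# Hypothetical helper names, not the athon API.

def retrieve(query, documents, top_k=1):
    """Rank documents by naive word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query, documents):
    """Augment the user query with the retrieved context."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "The mesh orchestrates intelligent plugins.",
    "Set OPENAI_API_KEY before starting the services.",
]
print(build_prompt("How do I set the API key?", docs))
```

A real RAG service would replace the word-overlap scoring with embedding-based vector search and send the augmented prompt to an LLM; the overall shape of the flow stays the same.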

Prerequisites

Before setting up the LLM Agentic Tool Mesh platform, please ensure the following prerequisites are met:

General Requirements

  • API Key: Set your OpenAI API key by assigning it to the OPENAI_API_KEY environment variable.

  • Python 3.11: Ensure Python 3.11 is installed on your machine.

    • It's recommended to install uv, a drop-in replacement for pip, venv, and other Python tooling.
    • You can install uv either via script or with pip:

    Option 1: Install via script (macOS/Linux)

    curl -LsSf https://astral.sh/uv/install.sh | sh
    source $HOME/.local/bin/env
    

    Option 2: Install via pip

    pip install uv
    
    • Optional: Enable shell completions
    echo 'eval "$(uv generate-shell-completion bash)"' >> ~/.bashrc
    echo 'eval "$(uvx --generate-shell-completion bash)"' >> ~/.bashrc
    

    Note: "Drop-in" means you can use uv in place of the original tools (e.g., pip, venv) without changing your workflow.
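Since the services read the key from the OPENAI_API_KEY environment variable mentioned above, it can help to fail fast when the key is missing. A minimal Python check (a sketch, not part of the athon API):

```python
import os

def require_api_key(name="OPENAI_API_KEY"):
    """Fail fast with a clear message if the key is missing or empty."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(
            f"{name} is not set; export it before starting the services."
        )
    return value
```

Running such a check at startup turns a confusing downstream authentication error into an immediate, actionable message.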

Installation Options

Option 1: Install LLM Agentic Tool Mesh Services Only

If you only need the core LLM Agentic Tool Mesh services without the example applications, you can install them directly via uv pip:

uv pip install -e '.[all]'

After installation, refer to the Usage Guide for instructions on using platform services.

Option 2: Full Example Setup

To use the complete setup, including examples and demo applications, follow these steps:

  1. Clone the Repository: Download the LLM Agentic Tool Mesh repository to your local machine.

    git clone https://github.com/HewlettPackard/llmesh.git
    cd llmesh
    
  2. Install Dependencies: All dependencies required by the platform are specified in the pyproject.toml file. Use the following commands to install them:

# Install with all extras
uv pip install -e ".[all]"

# Install with specific extras
uv pip install -e ".[chat,agents,rag]"

# Install with development/testing dependencies
uv pip install -e ".[all,test]"
  3. Setup for Specific Tools: Some tools, including tool_rag, tool_agents, and tool_analyzer, require additional setup (e.g., copying specific data files and initializing configurations). For detailed setup instructions, refer to the Installation Guide.
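The extras used in the install commands above (all, chat, agents, rag, test) correspond to optional dependency groups declared in pyproject.toml. A hypothetical fragment (illustrative package names, not the project's actual dependency list) might look like:

```toml
[project.optional-dependencies]
chat = ["some-llm-sdk"]      # hypothetical dependency
rag = ["some-vector-store"]  # hypothetical dependency
agents = ["some-agent-lib"]  # hypothetical dependency
test = ["pytest"]
all = ["agentic_python[chat,rag,agents]"]
```

Grouping dependencies this way lets users install only what they need, e.g. `uv pip install -e ".[rag]"` for RAG work alone.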

Running the UIs

You can run the tools and web applications individually or use the provided script src/infra/scripts/start_examples.sh to run them all together. Once everything is started, you can access the chatbot app at https://127.0.0.1:5001/ and the back panel at https://127.0.0.1:5011/.

Running the Games

You can run the game web applications individually or use the provided script run_games.sh to run them all together. Once everything is started, you can access the app at https://127.0.0.1:5001/. Have fun! :)

References

For more details about installation, usage, and advanced configurations, please visit the LLM Agentic Tool Mesh project Wiki.

Contact

If you have any questions or need further assistance, feel free to contact me at antonio.fin@hpe.com.

Download files

Download the file for your platform.

Source Distribution

agentic_python-0.2.0.tar.gz (57.1 kB)

Uploaded Source

Built Distribution


agentic_python-0.2.0-py3-none-any.whl (121.9 kB)

Uploaded Python 3

File details

Details for the file agentic_python-0.2.0.tar.gz.

File metadata

  • Download URL: agentic_python-0.2.0.tar.gz
  • Upload date:
  • Size: 57.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for agentic_python-0.2.0.tar.gz:

  • SHA256: f9966b07dfd268a303e992708b92ac1c90153ab16ffd650c653b85df8e5a85b2
  • MD5: 3c5e1d6d94a60f5a4464a6091f9472f4
  • BLAKE2b-256: 0938f9a2615efa29b0a64fe344963befb9665a37df4e01929744add46504264c
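To verify a downloaded archive against a published digest, the Python standard library is enough; a minimal sketch:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Compute the SHA256 hex digest of a file, streaming in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the published digest for the file you downloaded, e.g.:
# expected = "f9966b07dfd268a303e992708b92ac1c90153ab16ffd650c653b85df8e5a85b2"
# assert sha256_of("agentic_python-0.2.0.tar.gz") == expected
```

Streaming in chunks keeps memory use constant even for large archives.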


File details

Details for the file agentic_python-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: agentic_python-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 121.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for agentic_python-0.2.0-py3-none-any.whl:

  • SHA256: c541294b191d4c7eef4b55d4445ddec83029267f8f03c23d97b13c960669601c
  • MD5: 1221b069bc53092e8388479ce7a90580
  • BLAKE2b-256: 5cda6130a3ba856e6daa0d7305f7fedef67279ba6480b05d53d561a038b50e32

