
LlamaIndex Azure Foundry Agent Integration

This package provides an Azure Foundry Agent integration for LlamaIndex. It allows you to leverage Azure AI Agent Service capabilities within your LlamaIndex applications. The provided AzureFoundryAgent class inherits from LlamaIndex's BaseWorkflowAgent, making it compatible with workflow-based multi-agent orchestration.

About Azure AI Agent Service

Azure AI Agent Service is a fully managed service designed to empower developers to securely build, deploy, and scale high-quality, extensible AI agents without needing to manage the underlying compute and storage resources.

Installation

You can install the package via pip:

pip install llama-index-agent-azure

or if working from source:

cd llama_index/llama-index-integrations/agent/llama-index-agent-azure
pip install -e .

You may also want to install python-dotenv if you plan to use a .env file for environment variables:

pip install python-dotenv

Prerequisites

Before using this integration, ensure you have:

  1. An Azure account and a provisioned Azure OpenAI service or an Azure AI Project with an agent-compatible endpoint.
  2. The necessary environment variables set up for authentication. Typically, this involves:
    • AZURE_PROJECT_ENDPOINT: Your Azure AI Project endpoint.
    • Standard Azure authentication environment variables recognized by DefaultAzureCredential (e.g., AZURE_CLIENT_ID, AZURE_TENANT_ID, AZURE_CLIENT_SECRET, or ensure you are logged in via Azure CLI).
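For local development, one common setup is to sign in with the Azure CLI (so DefaultAzureCredential can pick up your identity) and export the project endpoint. The endpoint URL shown below is illustrative; copy the actual value from your Azure AI Project's overview page:

```shell
# Sign in so DefaultAzureCredential can use your Azure CLI identity
az login

# Replace with the endpoint shown on your Azure AI Project's overview page
export AZURE_PROJECT_ENDPOINT="https://<your-resource>.services.ai.azure.com/api/projects/<your-project>"
```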

Usage

Here's a basic example of using AzureFoundryAgent with a function tool:

from llama_index.agent.azure_foundry_agent import AzureFoundryAgent
from dotenv import load_dotenv
import os

load_dotenv()

# Configure your Azure project endpoint
azure_project_endpoint = os.environ.get("AZURE_PROJECT_ENDPOINT")

if not azure_project_endpoint:
    raise ValueError("AZURE_PROJECT_ENDPOINT environment variable not set.")


# Define a sample tool (optional)
def get_weather(location: str) -> str:
    """Get the weather for a given location."""
    # This is a placeholder function. Replace with actual weather API call.
    return f"The weather in {location} is sunny."


# Instantiate the agent
# Note: The `model` parameter refers to a model deployment that should already be created in your Azure AI Project.
agent = AzureFoundryAgent(
    endpoint=azure_project_endpoint,
    model="gpt-4o",  # Specify your deployed model name
    name="my-azure-agent",
    instructions="You are a helpful assistant that can provide information and use tools.",
    verbose=True,
    tools=[get_weather],  # Pass your defined tools as a list
    run_retrieve_sleep_time=2,  # Time in seconds to wait between polling run status
)

# Run the agent. `agent.run()` is a coroutine: await it inside an async
# function (or directly in a notebook); in a plain script, wrap it with
# asyncio.run().
response = await agent.run(
    "What is the capital of France and what is the weather there?"
)
print("Agent Response:", response)

Example of using multimodal input with the agent:

# Example: Multimodal input (text + image)
# This works with multimodal-capable models such as gpt-4o
from llama_index.core.llms import ChatMessage, TextBlock, ImageBlock

multimodal_msg = ChatMessage(
    role="user",
    blocks=[
        TextBlock(text="Describe what you see in this image."),
        ImageBlock(url="https://example.com/sample-image.png"),
    ],
)
multimodal_response = await agent.run(multimodal_msg)
print("Multimodal Agent Response:", multimodal_response)


# Important: Azure agents and threads are stateful resources on Azure.
# Remember to clean them up from the Azure portal or via the Azure SDK, e.g.:
# await agent._client.agents.delete_agent(agent_id=agent._agent.id)
# await agent._client.agents.threads.delete(thread_id=agent._thread_id)

Key Parameters for AzureFoundryAgent

  • endpoint: The endpoint URL for your Azure AI Project or compatible service.
  • model: The name of the model deployment the agent should use (e.g., "gpt-4o", "gpt-35-turbo").
  • name: A name for your agent instance.
  • instructions: System instructions for the agent.
  • tools (Optional): A list of Python functions to be used as tools by the agent.
  • thread_id (Optional): An existing thread ID to continue a conversation. If not provided, a new thread is created.
  • verbose (Optional): Set to True for detailed logging.
  • run_retrieve_sleep_time (Optional): The time in seconds to wait between polling the status of an agent run. Defaults to 1.0.
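The run_retrieve_sleep_time parameter controls how often the agent polls Azure for run completion. Conceptually, the loop looks something like the sketch below; this illustrates the polling pattern, not the library's actual implementation, and get_status is a hypothetical callable returning the current run status:

```python
import time


def poll_until_done(get_status, sleep_time=1.0, max_polls=60):
    """Poll a run's status until it reaches a terminal state.

    Illustrative sketch of the polling loop that `run_retrieve_sleep_time`
    controls; `get_status` is a hypothetical callable returning the
    current run status string.
    """
    for _ in range(max_polls):
        status = get_status()
        if status in ("completed", "failed", "cancelled", "expired"):
            return status
        time.sleep(sleep_time)
    raise TimeoutError("Run did not reach a terminal state in time.")
```

A larger sleep_time reduces the number of status API calls at the cost of extra latency before the result is returned.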

Troubleshooting

  • Missing Environment Variables: Ensure AZURE_PROJECT_ENDPOINT and Azure credentials are set in your environment or .env file.
  • Resource Cleanup: Always delete agents and threads after use to avoid resource leaks and unnecessary Azure charges.
  • Dependency Issues: Make sure all required packages are installed, including python-dotenv if using .env files.
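A quick way to catch the missing-variable case early is to validate the environment at startup. This is a minimal sketch; extend the required list with whichever credential variables your authentication setup uses (e.g., AZURE_CLIENT_ID):

```python
import os


def missing_env(required):
    """Return the names of any required environment variables that are unset or empty.

    Minimal sketch; extend `required` with whichever credential
    variables your setup uses (e.g., AZURE_CLIENT_ID).
    """
    return [name for name in required if not os.environ.get(name)]


missing = missing_env(["AZURE_PROJECT_ENDPOINT"])
if missing:
    print(f"Missing environment variables: {', '.join(missing)}")
```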

For more details, see the Azure AI Agent Service documentation.
