
Foundation utilities and host framework for Azure AI Hosted Agents


Azure AI Agent Server Core client library for Python

The azure-ai-agentserver-core package provides the foundation host framework for building Azure AI Hosted Agent containers. It handles the protocol-agnostic infrastructure — health probes, graceful shutdown, OpenTelemetry tracing, and ASGI serving — so that protocol packages can focus on their endpoint logic.

Getting started

Install the package

pip install azure-ai-agentserver-core

OpenTelemetry tracing with Azure Monitor and OTLP exporters is included by default.

Prerequisites

  • Python 3.10 or later

Key concepts

AgentServerHost

AgentServerHost is the host process for Azure AI Hosted Agent containers. It provides:

  • Health probe — GET /readiness returns 200 OK when the server is ready.
  • Graceful shutdown — On SIGTERM the server drains in-flight requests (default 30 s timeout) before exiting.
  • OpenTelemetry tracing — Automatic span creation with Azure Monitor and OTLP export when configured.
  • Hypercorn ASGI server — Serves on 0.0.0.0:${PORT:-8088} with HTTP/1.1.

Protocol packages (e.g. azure-ai-agentserver-invocations) subclass AgentServerHost and add their endpoints in __init__.
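
The graceful-shutdown bullet above follows a common drain-then-exit pattern. The sketch below illustrates that pattern in plain asyncio; it is not the library's internal code, just the general shape (the drain function and its names are ours):

```python
import asyncio

# Illustrative sketch of drain-then-exit: wait for in-flight request tasks
# up to a timeout, cancel whatever is still running, then return.
async def drain(in_flight: set[asyncio.Task], timeout: float = 30.0) -> bool:
    """Wait for in-flight request tasks; return True if all finished in time."""
    if not in_flight:
        return True
    done, pending = await asyncio.wait(in_flight, timeout=timeout)
    for task in pending:
        task.cancel()  # give up on requests that exceed the drain window
    return not pending

async def main() -> None:
    fast = asyncio.create_task(asyncio.sleep(0.01))  # a short "request"
    print(await drain({fast}, timeout=1.0))  # finishes well within the window

asyncio.run(main())
```

A real host would collect the in-flight set from its request middleware and call the drain step from its SIGTERM handler.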

Environment variables

Variable                                Description                                         Default
PORT                                    Listen port                                         8088
FOUNDRY_AGENT_NAME                      Agent name (used in tracing)                        ""
FOUNDRY_AGENT_VERSION                   Agent version (used in tracing)                     ""
FOUNDRY_PROJECT_ENDPOINT                Azure AI Foundry project endpoint                   ""
FOUNDRY_PROJECT_ARM_ID                  Foundry project ARM resource ID (used in tracing)   ""
FOUNDRY_AGENT_SESSION_ID                Default session ID when not provided per-request    ""
APPLICATIONINSIGHTS_CONNECTION_STRING   Azure Monitor connection string                     (unset)
OTEL_EXPORTER_OTLP_ENDPOINT             OTLP collector endpoint                             (unset)
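
The PORT row resolves the way ${PORT:-8088} does in shell. A small sketch of that lookup (mirroring the documented default, not the library's internal code):

```python
import os

# Resolve the listen port the same way ${PORT:-8088} would: use the PORT
# environment variable when set, otherwise fall back to 8088.
def listen_port() -> int:
    return int(os.environ.get("PORT", "8088"))

os.environ.pop("PORT", None)
print(listen_port())         # default when PORT is unset
os.environ["PORT"] = "9000"
print(listen_port())         # explicit override
```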

Examples

AgentServerHost is typically used via a protocol package. The simplest setup with the invocations protocol:

from azure.ai.agentserver.invocations import InvocationAgentServerHost
from starlette.responses import JSONResponse

app = InvocationAgentServerHost()

@app.invoke_handler
async def handle(request):
    body = await request.json()
    name = body.get("name", "world")  # tolerate a missing "name" field
    return JSONResponse({"greeting": f"Hello, {name}!"})

app.run()

Subclassing AgentServerHost

For custom protocol implementations, subclass AgentServerHost and add routes:

from azure.ai.agentserver.core import AgentServerHost
from starlette.requests import Request
from starlette.responses import JSONResponse
from starlette.routing import Route

class MyAgentHost(AgentServerHost):
    def __init__(self, **kwargs):
        my_routes = [Route("/my-endpoint", self._handle, methods=["POST"])]
        existing = list(kwargs.pop("routes", None) or [])
        super().__init__(routes=existing + my_routes, **kwargs)

    async def _handle(self, request: Request):
        return JSONResponse({"status": "ok"})

app = MyAgentHost()
app.run()

Shutdown handler

Register a cleanup function that runs during graceful shutdown:

app = AgentServerHost()

@app.shutdown_handler
async def on_shutdown():
    # Close database connections, flush buffers, etc.
    pass
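
The decorator pattern above can be sketched with a minimal stand-in registry. This is illustrative only; Host here is our own toy class, not the real AgentServerHost:

```python
import asyncio

# Minimal sketch of a shutdown_handler decorator: registered async cleanup
# functions run in registration order during shutdown.
class Host:
    def __init__(self) -> None:
        self._shutdown_handlers = []

    def shutdown_handler(self, fn):
        """Register an async cleanup function and return it unchanged."""
        self._shutdown_handlers.append(fn)
        return fn

    async def run_shutdown(self) -> None:
        for fn in self._shutdown_handlers:
            await fn()

host = Host()
events: list[str] = []

@host.shutdown_handler
async def on_shutdown() -> None:
    events.append("cleaned up")

asyncio.run(host.run_shutdown())
print(events)
```

Because the decorator returns the function unchanged, the handler stays directly callable in tests.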

Configuring tracing

Tracing is enabled automatically when an Application Insights connection string is available:

app = AgentServerHost(
    applicationinsights_connection_string="InstrumentationKey=...",
)

Or via environment variable:

export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=..."
python my_agent.py

Troubleshooting

Logging

Set the log level to DEBUG for detailed diagnostics:

app = AgentServerHost(log_level="DEBUG")
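
A log_level string like "DEBUG" typically maps onto the standard logging module. A rough sketch of that mapping (the host's exact wiring and logger name may differ; both are assumptions here):

```python
import logging

# Map a level name such as "DEBUG" onto a stdlib logger.
def apply_log_level(level_name: str) -> logging.Logger:
    logger = logging.getLogger("azure.ai.agentserver")
    logger.setLevel(getattr(logging, level_name.upper()))
    return logger

logger = apply_log_level("DEBUG")
print(logger.isEnabledFor(logging.DEBUG))
```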

Reporting issues

To report an issue with the client library, or to request additional features, please open an issue in the azure-sdk-for-python GitHub repository.


Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
