Foundation utilities and host framework for Azure AI Hosted Agents

Azure AI Agent Server Core client library for Python

The azure-ai-agentserver-core package provides the foundation host framework for building Azure AI Hosted Agent containers. It handles the protocol-agnostic infrastructure — health probes, graceful shutdown, OpenTelemetry tracing, and ASGI serving — so that protocol packages can focus on their endpoint logic.

Getting started

Install the package

pip install azure-ai-agentserver-core

OpenTelemetry tracing with Azure Monitor and OTLP exporters is included by default.

Prerequisites

  • Python 3.10 or later

Key concepts

AgentServerHost

AgentServerHost is the host process for Azure AI Hosted Agent containers. It provides:

  • Health probe — GET /readiness returns 200 OK when the server is ready.
  • Graceful shutdown — On SIGTERM the server drains in-flight requests (default 30 s timeout) before exiting.
  • OpenTelemetry tracing — Automatic span creation with Azure Monitor and OTLP export when configured.
  • Hypercorn ASGI server — Serves on 0.0.0.0:${PORT:-8088} with HTTP/1.1.
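The graceful-shutdown drain described above can be sketched with asyncio alone. This is a simplified illustration, not the library's actual implementation; the `drain` function and its use of the 30 s default are stand-ins:

```python
import asyncio

async def drain(in_flight: set, timeout: float = 30.0) -> bool:
    """Wait for in-flight request tasks to finish, up to `timeout` seconds.

    Returns True if every task completed before the deadline; stragglers
    past the deadline are cancelled, mirroring a forced exit.
    """
    if not in_flight:
        return True
    done, pending = await asyncio.wait(in_flight, timeout=timeout)
    for task in pending:
        task.cancel()
    return not pending

async def main() -> bool:
    async def request(delay: float) -> None:
        await asyncio.sleep(delay)

    # Two short "requests" that finish well inside the drain window.
    tasks = {asyncio.create_task(request(0.01)), asyncio.create_task(request(0.02))}
    return await drain(tasks, timeout=1.0)

print(asyncio.run(main()))  # True: both requests drained in time
```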

Protocol packages (e.g. azure-ai-agentserver-invocations) subclass AgentServerHost and add their endpoints in __init__.

Environment variables

Variable                                Description                                         Default
PORT                                    Listen port                                         8088
FOUNDRY_AGENT_NAME                      Agent name (used in tracing)                        ""
FOUNDRY_AGENT_VERSION                   Agent version (used in tracing)                     ""
FOUNDRY_PROJECT_ENDPOINT                Azure AI Foundry project endpoint                   ""
FOUNDRY_PROJECT_ARM_ID                  Foundry project ARM resource ID (used in tracing)   ""
FOUNDRY_AGENT_SESSION_ID                Default session ID when not provided per-request    ""
APPLICATIONINSIGHTS_CONNECTION_STRING   Azure Monitor connection string                     (unset)
OTEL_EXPORTER_OTLP_ENDPOINT             OTLP collector endpoint                             (unset)
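If your own code needs the same settings, they can be resolved with the standard library, applying the documented defaults. This is a sketch; `load_config` is not part of the package:

```python
import os

def load_config(env=os.environ) -> dict:
    """Resolve the host's settings from the environment, applying the
    documented defaults when a variable is unset."""
    return {
        "port": int(env.get("PORT", "8088")),
        "agent_name": env.get("FOUNDRY_AGENT_NAME", ""),
        "agent_version": env.get("FOUNDRY_AGENT_VERSION", ""),
        "project_endpoint": env.get("FOUNDRY_PROJECT_ENDPOINT", ""),
        "project_arm_id": env.get("FOUNDRY_PROJECT_ARM_ID", ""),
        "session_id": env.get("FOUNDRY_AGENT_SESSION_ID", ""),
        "appinsights_connection_string": env.get("APPLICATIONINSIGHTS_CONNECTION_STRING"),
        "otlp_endpoint": env.get("OTEL_EXPORTER_OTLP_ENDPOINT"),
    }

print(load_config(env={})["port"])  # falls back to the documented default, 8088
```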

Examples

AgentServerHost is typically used via a protocol package. The simplest setup with the invocations protocol:

from azure.ai.agentserver.invocations import InvocationAgentServerHost
from starlette.responses import JSONResponse

app = InvocationAgentServerHost()

@app.invoke_handler
async def handle(request):
    body = await request.json()
    return JSONResponse({"greeting": f"Hello, {body['name']}!"})

app.run()

Subclassing AgentServerHost

For custom protocol implementations, subclass AgentServerHost and add routes:

from azure.ai.agentserver.core import AgentServerHost
from starlette.requests import Request
from starlette.responses import JSONResponse
from starlette.routing import Route

class MyAgentHost(AgentServerHost):
    def __init__(self, **kwargs):
        my_routes = [Route("/my-endpoint", self._handle, methods=["POST"])]
        existing = list(kwargs.pop("routes", None) or [])
        super().__init__(routes=existing + my_routes, **kwargs)

    async def _handle(self, request: Request):
        return JSONResponse({"status": "ok"})

app = MyAgentHost()
app.run()
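The key detail in the subclass is popping any caller-supplied routes before delegating, so both the caller's routes and the subclass's own survive. Stripped of Starlette, the pattern looks like this (`Base` and `Child` are illustrative stand-ins, not package classes):

```python
class Base:
    """Stand-in for AgentServerHost: stores whatever routes it is given."""
    def __init__(self, routes=None, **kwargs):
        self.routes = list(routes or [])

class Child(Base):
    """Stand-in for a protocol subclass: appends its own routes to the caller's."""
    def __init__(self, **kwargs):
        mine = ["/my-endpoint"]
        existing = list(kwargs.pop("routes", None) or [])
        super().__init__(routes=existing + mine, **kwargs)

app = Child(routes=["/caller-endpoint"])
print(app.routes)  # ['/caller-endpoint', '/my-endpoint']
```

Popping `routes` out of `kwargs` before calling `super().__init__` avoids passing the same keyword twice.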

Shutdown handler

Register a cleanup function that runs during graceful shutdown:

app = AgentServerHost()

@app.shutdown_handler
async def on_shutdown():
    # Close database connections, flush buffers, etc.
    pass
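The decorator pattern above can be mimicked in plain Python to see the mechanics. This is a sketch of the general pattern; `CleanupRegistry` is not the package's class:

```python
import asyncio

class CleanupRegistry:
    """Minimal sketch of a shutdown-handler mechanism: coroutines
    registered via the decorator are awaited, in order, at shutdown."""
    def __init__(self):
        self._handlers = []

    def shutdown_handler(self, fn):
        self._handlers.append(fn)
        return fn  # return the function unchanged so it stays callable

    async def shutdown(self):
        for fn in self._handlers:
            await fn()

registry = CleanupRegistry()
closed = []

@registry.shutdown_handler
async def on_shutdown():
    closed.append("db")  # e.g. close database connections, flush buffers

asyncio.run(registry.shutdown())
print(closed)  # ['db']
```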

Configuring tracing

Tracing is enabled automatically when an Application Insights connection string is available:

app = AgentServerHost(
    applicationinsights_connection_string="InstrumentationKey=...",
)

Or via environment variable:

export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=..."
python my_agent.py
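A reasonable mental model — and this precedence is an assumption, not documented behavior — is that an explicit constructor argument wins over the environment variable:

```python
import os

def resolve_connection_string(explicit=None, env=os.environ):
    """Sketch of the assumed precedence: an explicit value wins, then the
    APPLICATIONINSIGHTS_CONNECTION_STRING environment variable, else None."""
    return explicit or env.get("APPLICATIONINSIGHTS_CONNECTION_STRING")

print(resolve_connection_string(
    "InstrumentationKey=explicit",
    env={"APPLICATIONINSIGHTS_CONNECTION_STRING": "InstrumentationKey=env"},
))  # the explicit argument wins
```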

Troubleshooting

Logging

Set the log level to DEBUG for detailed diagnostics:

app = AgentServerHost(log_level="DEBUG")
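You can also raise verbosity through Python's standard logging, independently of the host. Azure SDK libraries log under the "azure" logger hierarchy; this stdlib sketch routes DEBUG records to stderr:

```python
import logging

# Send DEBUG-level records to stderr with timestamps.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)

# Azure SDK libraries log under the "azure" logger hierarchy.
logging.getLogger("azure").setLevel(logging.DEBUG)

print(logging.getLogger("azure").getEffectiveLevel() == logging.DEBUG)  # True
```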

Reporting issues

To report an issue with the client library, or to request additional features, please open a GitHub issue on the repository.

Next steps

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
