# Agents hosting adapter for Azure AI
## Supported frameworks
- [WIP] LangGraph
- [WIP] Microsoft Agent Framework
## Install

In the current folder, run:

```shell
pip install -e .
```
## Usage

### LangGraph
```python
# your existing agent
from my_langgraph_agent import my_awesome_agent

# langgraph utils
from azure.ai.agentshosting import from_langgraph

if __name__ == "__main__":
    # with this single line, your agent will be hosted on http://localhost:8088
    from_langgraph(my_awesome_agent).run()
```
> **Note:** If your LangGraph agent does not use LangGraph's built-in `MessagesState`, you should implement your own `LanggraphStateConverter` and pass it to `from_langgraph`. Reference this example for more details.
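To illustrate the idea behind a state converter (the exact `LanggraphStateConverter` interface is defined by the package and is not reproduced here), a converter maps between your custom state shape and a messages-style state. The sketch below is purely hypothetical: `MyCustomState`, `MyStateConverter`, and both method names are illustrative assumptions, not the package's API.

```python
from dataclasses import dataclass, field
from typing import Any


# Hypothetical custom state -- NOT LangGraph's MessagesState.
@dataclass
class MyCustomState:
    question: str = ""
    answer: str = ""
    scratchpad: list[Any] = field(default_factory=list)


class MyStateConverter:
    """Illustration only: translate between the custom state and a
    messages-style dict of {"messages": [{"role": ..., "content": ...}]}."""

    def to_messages(self, state: MyCustomState) -> dict:
        # Project the custom fields onto a chat-message list.
        messages = [{"role": "user", "content": state.question}]
        if state.answer:
            messages.append({"role": "assistant", "content": state.answer})
        return {"messages": messages}

    def from_messages(self, messages_state: dict) -> MyCustomState:
        # Rebuild the custom state from the message roles.
        state = MyCustomState()
        for m in messages_state["messages"]:
            if m["role"] == "user":
                state.question = m["content"]
            elif m["role"] == "assistant":
                state.answer = m["content"]
        return state
```

Whatever the real interface looks like, the essential property is that a round trip through the two directions preserves your agent's state.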
### Microsoft Agent Framework
```python
# your existing agent
from my_framework_agent import my_awesome_agent

# agent framework utils
from azure.ai.agentshosting import from_agent_framework

if __name__ == "__main__":
    # with this single line, your agent will be hosted on http://localhost:8088
    from_agent_framework(my_awesome_agent).run()
```
## Custom code
If your agent is not built using a supported framework, you can still make it compatible with Microsoft AI Foundry by manually implementing the predefined interface.
```python
import datetime

from azure.ai.agentshosting import FoundryCBAgent
from azure.ai.agentshosting.models.azureaiagents.models import CreateResponse
from azure.ai.agentshosting.models.openai.models import (
    ItemContentOutputText,
    Response as OpenAIResponse,
    ResponsesAssistantMessageItemResource,
    ResponseTextDeltaEvent,
    ResponseTextDoneEvent,
)


def stream_events(text: str):
    # Emit one delta event per whitespace-separated token,
    # then a final done event carrying the assembled text.
    assembled = ""
    tokens = text.split(" ")
    for i, token in enumerate(tokens):
        piece = token if i == len(tokens) - 1 else token + " "
        assembled += piece
        yield ResponseTextDeltaEvent(delta=piece)
    # Done with text
    yield ResponseTextDoneEvent(text=assembled)


async def agent_run(request_body: CreateResponse):
    agent = request_body.agent
    print(f"agent: {agent}")
    if request_body.stream:
        return stream_events("I am a mock agent with no intelligence, in stream mode.")

    # Build assistant output content
    output_content = [
        ItemContentOutputText(
            text="I am a mock agent with no intelligence.",
            annotations=[],
        )
    ]
    response = OpenAIResponse(
        metadata={},
        temperature=0.0,
        top_p=0.0,
        user="me",
        id="id",
        created_at=datetime.datetime.now(),
        output=[
            ResponsesAssistantMessageItemResource(
                status="completed",
                content=output_content,
            )
        ],
    )
    return response


my_agent = FoundryCBAgent()
my_agent.agent_run = agent_run

if __name__ == "__main__":
    my_agent.run()
```
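The contract between the delta and done events above is that the concatenated deltas equal the final text, so a client can reconstruct the response from the stream alone. A self-contained sketch of that reassembly, with hypothetical stand-in dataclasses (`TextDelta`, `TextDone`) in place of the SDK models:

```python
from dataclasses import dataclass


@dataclass
class TextDelta:  # stand-in for ResponseTextDeltaEvent
    delta: str


@dataclass
class TextDone:  # stand-in for ResponseTextDoneEvent
    text: str


def stream_events(text: str):
    """Yield one delta per whitespace-separated token, then a done event."""
    assembled = ""
    tokens = text.split(" ")
    for i, token in enumerate(tokens):
        piece = token if i == len(tokens) - 1 else token + " "
        assembled += piece
        yield TextDelta(delta=piece)
    yield TextDone(text=assembled)


# A client rebuilds the full text from the deltas alone and can
# cross-check it against the final done event.
events = list(stream_events("streaming responses one token at a time"))
rebuilt = "".join(e.delta for e in events if isinstance(e, TextDelta))
final = events[-1].text
```

Appending the separator to every token except the last keeps the deltas lossless, which is what makes the reassembly exact.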
## Download files
Download the file for your platform.
## File details

Details for the file `my_agents_adapter-0.0.8.tar.gz`.

### File metadata

- Download URL: my_agents_adapter-0.0.8.tar.gz
- Upload date:
- Size: 148.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.9.5

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | cf95d79f1ea67b83d2a43f82818bf1d6dbb198dafdb20459653a3c8a19c87af9 |
| MD5 | 490a646c35e587f020090fd21edb71f8 |
| BLAKE2b-256 | fc8da5876ae161cbb2499c93eae055a18f9e36a04798913e62752c3a5cb2ddfb |
## File details

Details for the file `my_agents_adapter-0.0.8-py3-none-any.whl`.

### File metadata

- Download URL: my_agents_adapter-0.0.8-py3-none-any.whl
- Upload date:
- Size: 176.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.9.5

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 1209934e6d70beb84aa607a46f1e8d78ae3c590da9ee1c79759365de69f63c02 |
| MD5 | 7c7fc21172a7466f81cb581c2ac12901 |
| BLAKE2b-256 | 64035b89678eb6af2ed1b3f2b91ca5572f95c618329e28ca3024008d2dbaf4fa |