Llama-stack integration for Next Gen UI Agent
Next Gen UI Llama Stack Integration
This module is part of the Next Gen UI Agent project.
Support for Llama Stack framework.
Provides
- `NextGenUILlamaStackAgent` - takes all tool messages from the provided conversation turn steps (Llama Stack Agent API) and processes their data into UI components.
- `LlamaStackAgentInference` and `LlamaStackAsyncAgentInference` - use an LLM hosted on a Llama Stack server (Llama Stack Chat Completion API).
The tool name is used as `InputData.type` for each tool message, so it can drive Hand Build Component selection based on the mapping in the UI Agent's configuration.
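The selection mechanism can be illustrated with a plain-Python sketch. Note that the mapping keys and component names below are hypothetical, invented for illustration; they are not the agent's actual configuration schema:

```python
# Hypothetical illustration: tool names become InputData.type values,
# which a configuration mapping can route to hand-built components.
# Neither the mapping keys nor the component names are taken from the
# real library; they only show the selection idea.

hand_built_components = {
    "movies": "MovieDetailCard",   # tool name -> custom UI component
    "actors": "ActorListTable",
}

def select_component(tool_name: str) -> str:
    # Fall back to dynamic (LLM-driven) component selection
    # when no hand-built mapping exists for this tool name.
    return hand_built_components.get(tool_name, "<llm-selected>")

print(select_component("movies"))   # MovieDetailCard
print(select_component("weather"))  # <llm-selected>
```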
Installation
```shell
pip install -U next_gen_ui_llama_stack
```
Example
Integrate Next Gen UI with your assistant
Suppose you have a ReAct agent, e.g. a Movies agent, like this:
```python
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.react.agent import ReActAgent

# Host and port of your running Llama Stack server
client = LlamaStackClient(
    base_url=f"http://{LLAMA_STACK_HOST}:{LLAMA_STACK_PORT}",
)

INFERENCE_MODEL = "meta-llama/Llama-3.2-3B-Instruct"

movies_agent = ReActAgent(
    client=client,
    model=INFERENCE_MODEL,
    client_tools=[
        movies,  # your client tool returning movie data
    ],
    json_response_format=True,
)

session_id = movies_agent.create_session("test-session")

# Send a query to your agent
response = movies_agent.create_turn(
    messages=[{"role": "user", "content": user_input}],
    session_id=session_id,
    stream=False,
)
```
Use the `NextGenUILlamaStackAgent` class: pass the Llama Stack client and the model name, then pass the steps from your Movies agent to the Next Gen UI Agent.
```python
from next_gen_ui_llama_stack import NextGenUILlamaStackAgent

# Pass the steps to the Next Gen UI Agent
ngui_agent = NextGenUILlamaStackAgent(client, INFERENCE_MODEL)
result = await ngui_agent.turn_from_steps(user_input, steps=response.steps)
```
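Since `turn_from_steps` is awaited, it must run inside an event loop; outside an async framework a minimal pattern looks like this (with a stand-in coroutine in place of the real agent call, so the sketch is self-contained):

```python
import asyncio

# Stand-in for ngui_agent.turn_from_steps(...); in real code you
# would await the actual agent call inside main().
async def turn_from_steps_stub(user_input: str) -> str:
    return f"UI component for: {user_input}"

async def main() -> str:
    return await turn_from_steps_stub("Tell me about Toy Story")

result = asyncio.run(main())
print(result)  # UI component for: Tell me about Toy Story
```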
File details
Details for the file next_gen_ui_llama_stack-0.2.2-py3-none-any.whl.
File metadata
- Download URL: next_gen_ui_llama_stack-0.2.2-py3-none-any.whl
- Upload date:
- Size: 5.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `853e000162234437af44192c3141a99116a9333effc901460b3b8e6c28634aba` |
| MD5 | `e77a8604c04d576d60a9bb45339b214f` |
| BLAKE2b-256 | `c164d12b7556cf38dcf4f1eaa8ffab468d1d79ee2eb72f8efaa4d1e9c8daa537` |