autogenui: a UI for the AutoGen library
AutoGen UI
Experimental UI for working with AutoGen agents, built on the AutoGen library. The frontend is built with Next.js and the web API with FastAPI.
Why AutoGen UI?
AutoGen is a framework that enables the development of LLM applications using multiple agents that can converse with each other to solve complex tasks. A UI can help in the development of such applications by enabling rapid prototyping, testing and debugging of agents and agent flows (defining, composing, etc.), and inspection of agent behaviors and outcomes.
Note: This is early work in progress.
Note that you will have to set your OPENAI_API_KEY (or a general llm config) as an environment variable. Also see this article on how AutoGen supports multiple LLM providers.
export OPENAI_API_KEY=<your key>
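As a minimal sketch (assuming the pyautogen package and an OAI_CONFIG_LIST file or environment variable, the standard AutoGen configuration pattern), an llm config can also be loaded in code instead of relying only on the environment variable:

import autogen

# Load model configurations from an OAI_CONFIG_LIST file or environment
# variable; the name "OAI_CONFIG_LIST" here is an assumption for illustration.
config_list = autogen.config_list_from_json(env_or_file="OAI_CONFIG_LIST")
llm_config = {"config_list": config_list}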
Getting Started
Install dependencies. Python 3.9+ is required. You can install from PyPI using pip:
pip install autogenui
or, to install from source:
git clone git@github.com:victordibia/autogen-ui.git
cd autogenui
pip install -e .
Run the UI server.
autogenui # or with --port 8081
Open http://localhost:8081 in your browser.
To modify the UI, make changes in the frontend source files and run npm run build to rebuild the frontend.
Roadmap
- FastAPI endpoint for AutoGen. This involves setting up a FastAPI endpoint that can respond to end-user prompt-based requests using a basic two-agent format (see the endpoint sketch after this list).
- Basic Chat UI. Frontend UI with a chat box for sending requests and showing responses from the endpoint for a basic two-agent format.
- Debug Tools: enable support for useful debugging capabilities, such as:
  - viewing the number of agent turns per request
  - defining the agent config (e.g. assistant agent + code agent)
  - appending conversation history per request
  - displaying the cost of each interaction (token count and $ cost)
- Streaming UI. Enable streaming of agent responses to the UI, so that responses are shown as they are generated instead of only after the entire response is complete (see the streaming sketch after this list).
- Flow-based Playground UI. Explore the use of a tool like React Flow to add agent nodes and compose agent flows. For example, set up an assistant agent + a code agent, click run, and view the output in a chat window.
  - Create agent nodes
  - Compose agent nodes into flows
  - Run agent flows
  - Explore external integrations, e.g. with Flowise
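The sketch below illustrates what a basic two-agent FastAPI endpoint could look like. It is an assumption for illustration, not the actual autogenui backend: the /api/generate route, the request schema, and the config loading are hypothetical, and it uses the standard pyautogen AssistantAgent / UserProxyAgent pattern.

import autogen
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class GenerateRequest(BaseModel):  # hypothetical request schema
    prompt: str

@app.post("/api/generate")  # hypothetical route, not necessarily the autogenui route
def generate(req: GenerateRequest):
    # Basic two-agent format: an assistant agent plus a user proxy agent.
    config_list = autogen.config_list_from_json(env_or_file="OAI_CONFIG_LIST")
    assistant = autogen.AssistantAgent(
        "assistant", llm_config={"config_list": config_list}
    )
    user_proxy = autogen.UserProxyAgent(
        "user_proxy",
        human_input_mode="NEVER",
        code_execution_config=False,
        max_consecutive_auto_reply=2,  # keep the exchange bounded for a demo
    )
    user_proxy.initiate_chat(assistant, message=req.prompt)
    # Return the conversation history for this request.
    return {"messages": user_proxy.chat_messages[assistant]}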
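For the streaming item, one common pattern (again an assumption, not the autogen-ui implementation) is to push partial agent messages to the browser with FastAPI's StreamingResponse using server-sent events; the /api/stream route and the placeholder chunks below are hypothetical.

import asyncio
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.get("/api/stream")  # hypothetical route
async def stream():
    async def event_source():
        # In a real server these chunks would come from the running agents,
        # e.g. placed on an asyncio.Queue as each message is generated.
        for chunk in ["agent turn 1 ...", "agent turn 2 ...", "TERMINATE"]:
            yield f"data: {chunk}\n\n"  # server-sent events framing
            await asyncio.sleep(0.1)
    return StreamingResponse(event_source(), media_type="text/event-stream")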
References
@inproceedings{wu2023autogen,
title={AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework},
author={Qingyun Wu and Gagan Bansal and Jieyu Zhang and Yiran Wu and Shaokun Zhang and Erkang Zhu and Beibin Li and Li Jiang and Xiaoyun Zhang and Chi Wang},
year={2023},
eprint={2308.08155},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: autogenui-0.0.4a0.tar.gz
Built Distribution: autogenui-0.0.4a0-py3-none-any.whl
File details
Details for the file autogenui-0.0.4a0.tar.gz.
File metadata
- Download URL: autogenui-0.0.4a0.tar.gz
- Upload date:
- Size: 730.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | ec1da2972cb2e02154d208c27d6a2dcd486a3d27f10952f4f3f6823069ea8962
MD5 | 74de26ae6e4e04c52d972addd58e8c48
BLAKE2b-256 | f9b793060e6dfa637bce34a6a9a223cfdb812731ade48de114d0e3d3ffe3039f
File details
Details for the file autogenui-0.0.4a0-py3-none-any.whl.
File metadata
- Download URL: autogenui-0.0.4a0-py3-none-any.whl
- Upload date:
- Size: 725.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 40d9c4b1440404ab23597c1d3b6383782b119f5eb9e8a65aa4138becdeca51e6
MD5 | 226e5b75c69f42d5abf7d116c5205d00
BLAKE2b-256 | bc548edbb19a441ee1e6a0a613a8ab37fa7fe213d30cebbdd343b1918c28cfa6