LLM Group Chat Framework: chat with multiple LLMs at the same time

Project description

Latest Progress 🎉

  • [February 2024] Add mistral-7b model
  • [February 2024] Add gemini-pro model
  • [January 2024] Refactored config-template.yaml to control both the backend and the frontend settings in one place; see the documentation for more about config-template.yaml
  • [January 2024] Add internlm2-chat-7b model
  • [January 2024] Released version v0.0.1, officially open source!

Introduction

What is OpenAOE?

AOE, an acronym from DOTA2 for Area Of Effect, denotes an ability that can affect a group of targets within a certain area. Here, AOE in AI means that users can obtain parallel outputs from multiple LLMs with a single prompt at the same time.

What problem does OpenAOE want to solve?

Currently, there are many open-source frameworks for chatting with ChatGPT-style models, but an LGC (LLM Group Chat) framework has been missing.

OpenAOE fills this gap: it helps LLM researchers, evaluators, engineering developers, and even non-professionals quickly access the market's well-known commercial and open-source LLMs, providing both a single-model serial response mode and a multi-model parallel response mode.

What can you get from OpenAOE?

OpenAOE can:

  1. return answers from one or more LLMs at the same time for a single prompt.
  2. provide access to commercial LLM APIs, with default support for gpt3.5, gpt4, Google Palm, Minimax, Claude, Spark, etc., and also support user-defined access to other large model APIs. (API keys need to be prepared in advance)
  3. provide access to open-source LLM APIs. (We recommend using LMDeploy to deploy them with one click)
  4. provide backend APIs and a web UI to meet different requirements.
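The parallel response mode in point 1 boils down to fanning one prompt out to several model backends at once. Below is a minimal sketch of that idea; the stub model functions and `fan_out` helper are illustrative placeholders, not OpenAOE's actual API:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub "models" standing in for real LLM API calls (illustrative only).
def gpt35(prompt: str) -> str:
    return f"gpt-3.5 says: {prompt}"

def claude(prompt: str) -> str:
    return f"claude says: {prompt}"

def fan_out(prompt: str, models: dict) -> dict:
    """Send one prompt to every model in parallel and gather the answers."""
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in models.items()}
        return {name: fut.result() for name, fut in futures.items()}

answers = fan_out("Hello!", {"gpt-3.5": gpt35, "claude": claude})
print(answers)
```

In the real framework, each stub would be an HTTP call to the corresponding LLM API, and the thread pool lets slow providers overlap instead of serializing.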

Quick Run

[!TIP] Require python >= 3.9

We provide three different ways to run OpenAOE: by pip, by docker, and from source code.

Run by pip

Install

pip install -U openaoe 

Start

openaoe -f /path/to/your/config-template.yaml

Run by docker

Install

There are two ways to get the OpenAOE docker image:

  1. pull the OpenAOE docker image
docker pull opensealion/openaoe:latest
  2. or build a docker image
git clone https://github.com/internlm/OpenAOE
cd OpenAOE
docker build . -f docker/Dockerfile -t opensealion/openaoe:latest

Start

docker run -p 10099:10099 -v /path/to/your/config-template.yaml:/app/config.yaml --name OpenAOE opensealion/openaoe:latest

Run by source code

Install

  1. clone this project
git clone https://github.com/internlm/OpenAOE
  2. [optional] build the frontend project when the frontend code has changed
cd OpenAOE/openaoe/frontend
npm install
npm run build

Start

cd OpenAOE # this OpenAOE is the clone directory
pip install -r openaoe/backend/requirements.txt
python -m openaoe.main -f /path/to/your/config-template.yaml

[!TIP] /path/to/your/config-template.yaml is a configuration file loaded by OpenAOE at startup, which contains the relevant configuration information for the LLMs, including API URLs, AKSKs, tokens, etc. A template configuration yaml file can be found in openaoe/backend/config/config-template.yaml. Note that this config-template.yaml DOES NOT contain any API access data; you need to add it yourself.
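Since the template ships without any API credentials, a quick pre-flight check on the parsed config can catch unfilled fields before startup. The sketch below assumes the config has already been parsed into a dict (e.g. with yaml.safe_load); the field names `api_base` and `api_key` are hypothetical placeholders, not the template's real schema:

```python
# Pre-flight check for a parsed config dict (field names are hypothetical).
def missing_credentials(models: dict) -> list:
    """Return the names of models whose API key is still empty or absent."""
    missing = []
    for name, cfg in models.items():
        if not cfg.get("api_key"):  # catches "" and missing keys alike
            missing.append(name)
    return missing

models = {
    "gpt-4": {"api_base": "https://api.openai.com/v1", "api_key": ""},
    "claude": {"api_base": "https://api.anthropic.com", "api_key": "sk-..."},
}
print(missing_credentials(models))  # ['gpt-4']
```

Failing fast with a list of unconfigured models is friendlier than a cryptic HTTP 401 from the first provider mid-chat.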

Tech Report

You are always welcome to fork this project, contribute your work, and pick up the TODOs in the future.

If you want to add more LLMs' APIs or features based on OpenAOE, the following info might be helpful.

Tech Stack

The technology stack we use includes:

  1. Backend framework based on python + fastapi;
  2. Frontend framework based on typescript + Sealion-Client (a wrapper around React) + Sealion-UI;
  3. Build tools:
    1. conda: quickly create a virtual python env and install the necessary packages
    2. npm: build the frontend project

[!TIP] The build tools can be installed quickly by pip install -U sealion-cli

Organization of the Repo

  • Frontend codes are in openaoe/frontend
  • Backend codes are in openaoe/backend
  • Project entry-point is openaoe/main.py

How to add a new model

Frontend

  • Add the new model's info (name, avatar, provider, etc.) in openaoe/frontend/src/config/model-config.ts
  • Add the new model's basic API request payload configuration in openaoe/frontend/src/config/api-config.ts
  • Adjust the new model's payload in openaoe/frontend/src/services/fetch.ts; you may need to change the payload structure and handle corner cases according to your model's API definition.

