# Multi-Agent Reasoning Problem Solver (MAR-PS)

MAR-PS is a multi-agent reasoning problem solver: you build teams of agents that work together to solve the problems you give them, and you can participate as a member of the team yourself.
## Install

MAR-PS can be installed via pip with the following command:

```shell
pip install mar-ps
```
## Backends

Currently, MAR-PS supports both Ollama and OpenAI as backends. We plan to add support for other backends in the future.
## Usage

Here is an example using an Ollama backend. First, let's set up the Ollama client and use it to create a model.

```python
from mar_ps import (
    OllamaClient,  # Ollama API client
    MAR,  # Multi-Agent Reasoning system class
    Model,  # the model class
    system,  # the system entity, for giving the initial system prompt
    Message,  # the message class
)

ollama_client = OllamaClient()
model = Model("llama3.1", ollama_client)
```
Note that you can mix models from different backends. For example, you could have Claude 3.5 Sonnet as a coding expert and a local model for creativity (Claude runs on the OpenAI client). Next, let's create the MAR and add some entities.
```python
mar = MAR(model)  # This sets the default model.

logic_expert = mar.Entity(
    "Logic Expert",
    "an expert in logic and reasoning",  # lowercase first letter, no end punctuation; see the system prompt to understand why
)
math_expert = mar.Entity(
    "Math Expert",
    "an expert in math and solving problems",
)
```
In practice, you will likely want to use different models for different entities to play to their strengths. Now, make sure to add a user entity.
```python
user = mar.Entity(
    "User",
    "the one who gives problems and instructions",
    "",
    is_user=True,
    pin_to_all_models=True,  # all messages sent by this user will be pinned for all models to see
)
```
By setting `is_user=True`, you will be prompted to respond whenever a message is sent to the user.
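Conceptually, `is_user=True` swaps the model call for a console prompt. This standalone sketch shows that dispatch (the `get_response` helper and dict-based entities are illustrative, not MAR-PS internals; a `prompt_fn` parameter is injected so the sketch runs non-interactively):

```python
def get_response(entity, incoming, prompt_fn=input):
    # User entities answer via the console; model entities would call the backend.
    if entity.get("is_user"):
        return prompt_fn(f"From {incoming['sender']}: {incoming['text']}\nYou: ")
    return f"[model reply from {entity['name']}]"

user = {"name": "User", "is_user": True}
expert = {"name": "Logic Expert", "is_user": False}
msg = {"sender": "Logic Expert", "text": "What do you think?"}

# Inject a canned prompt function instead of blocking on real console input.
print(get_response(user, msg, prompt_fn=lambda p: "Looks right to me."))
print(get_response(expert, msg))  # [model reply from Logic Expert]
```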
Now let's give them a system prompt. Make sure to tell each entity who they are and who is on their team; without that, the system will not work. I have found the following system prompt to work best.
```python
for entity in mar.entities:
    # Build the teammate list outside the f-string: backslashes inside
    # f-string expressions are a syntax error before Python 3.12.
    teammates = "\n".join(
        f"{e.id}: {e.introduction[0].upper() + e.introduction[1:]}."
        for e in mar.entities
        if e != entity
    )
    entity.message_stack.append(
        Message(
            system,
            entity,
            f"This is the messaging application. Your team includes: {teammates}. "
            "You may address messages to any of them and receive messages from any of them. "
            "You may not send messages to anyone outside of your team. "
            "Your messages are private; only the sender and receiver can see them. "
            "Thus, you will need to share information with your teammates. "
            "There can only be one recipient per message; the messaging application "
            "does not support sending messages to multiple recipients at once. "
            f"You are {entity.id}, {entity.introduction}. "
            f"{entity.personal_prompt + ' ' if entity.personal_prompt else ''}"
            "Messages sent by you are started with To: and messages sent to you are started with From:.",
        )
    )
```
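To see what the teammate list actually looks like, here is a minimal standalone sketch that renders it the same way (a plain dataclass stands in for MAR-PS entities, which is an assumption for illustration):

```python
from dataclasses import dataclass

@dataclass
class Entity:
    id: str
    introduction: str

entities = [
    Entity("Logic Expert", "an expert in logic and reasoning"),
    Entity("Math Expert", "an expert in math and solving problems"),
    Entity("User", "the one who gives problems and instructions"),
]

me = entities[0]
# Capitalize each teammate's introduction and list one per line,
# exactly as the system-prompt loop does.
teammates = "\n".join(
    f"{e.id}: {e.introduction[0].upper() + e.introduction[1:]}."
    for e in entities
    if e != me
)
print(teammates)
# Math Expert: An expert in math and solving problems.
# User: The one who gives problems and instructions.
```

This is why the introductions are written with a lowercase first letter and no end punctuation: the prompt capitalizes and punctuates them itself.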
And finally, start the chat by sending a message.

```python
mar.start(logic_expert.send(input("You: "), user, print_all_messages=True))
```
Setting `print_all_messages` to `True` lets us see every message sent; otherwise, we would only see the messages sent to the user. See `simple_example.py` for the full code.
## TODO

### Features to add

- Add tool support
- Add streaming support

### Backends to add

- MLX-ENGINE (https://github.com/lmstudio-ai/mlx-engine/)
- Transformers (https://github.com/huggingface/transformers/)
- CoreML (https://github.com/apple/coremltools/)

### Hard ones to add

- Add support for multi-recipient messages
- Add support for multi-message responses

NOTE: These will be very difficult to implement because every time an entity receives a message, it tries to reply. If you send a message to many entities, they will all try to reply.
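The reply cascade is easy to see in a toy simulation (plain Python, nothing from MAR-PS; the `ToyAgent` class is purely illustrative): if one message fans out to N recipients and every recipient always replies, a single send produces N replies back to the sender.

```python
class ToyAgent:
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def receive(self, sender, text):
        self.inbox.append((sender.name, text))
        # Like a MAR-PS entity, this agent replies to every message it receives.
        return f"reply from {self.name}"

a, b, c = ToyAgent("A"), ToyAgent("B"), ToyAgent("C")

# A hypothetical multi-recipient message from A fans out to B and C...
replies = [agent.receive(a, "hello") for agent in (b, c)]
print(replies)  # ['reply from B', 'reply from C']
```

One send from A comes back as two replies that A must now handle, and each of those replies would itself trigger further replies.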
## Project details
## File details

### Details for the file `mar_ps-0.0.2.tar.gz`

File metadata:

- Download URL: mar_ps-0.0.2.tar.gz
- Upload date:
- Size: 17.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.13.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | fa64eb614e043518b727290c116203d5579c47c007303a784991898ea1602dcc |
| MD5 | 10c4d58eeed3e22fb6e33aff47b4807c |
| BLAKE2b-256 | 82e19b71a823fd80f65ea9d1bed96e16b40c0867693aca732470ba257e3e2a6f |
### Details for the file `mar_ps-0.0.2-py3-none-any.whl`

File metadata:

- Download URL: mar_ps-0.0.2-py3-none-any.whl
- Upload date:
- Size: 17.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.13.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 0ff88fbb32a6443e41ec8a5af672d0f6c865ef924f848135f7eec544eae069f3 |
| MD5 | 538b6934bcbb27f5eedfe35272655d8a |
| BLAKE2b-256 | 53dbd167ef8173fc25c3ce3208dc65f084d0fba56ea4ee6acd3e16211d24c8d2 |