MLLM
Multimodal Large Language Models
Installation
pip install mllm
Usage
Create an MLLM router with a list of preferred models
import os
from mllm import Router
os.environ["OPENAI_API_KEY"] = "..."
os.environ["ANTHROPIC_API_KEY"] = "..."
os.environ["GEMINI_API_KEY"] = "..."
router = Router(
    preference=["gpt-4-turbo", "anthropic/claude-3-opus-20240229", "gemini/gemini-pro-vision"]
)
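The preference list defines a fallback order: the router tries the first model and moves to the next on failure. As an illustrative sketch of that idea (not the mllm internals; `route_chat` and `send` are hypothetical names introduced here):

```python
# Illustrative sketch of preference-based routing with fallback.
# `send(model, request)` stands in for an actual provider API call.
def route_chat(preference, send, request):
    """Try each model in `preference` order; return the first successful reply."""
    last_err = None
    for model in preference:
        try:
            return send(model, request)
        except Exception as err:  # e.g. rate limit, auth error, model outage
            last_err = err
    raise RuntimeError(f"All models failed; last error: {last_err}")
```

With this shape, a transient failure on the first-choice model is transparent to the caller as long as any model in the list succeeds.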
Create a new role-based chat thread
from mllm import RoleThread
thread = RoleThread(owner_id="dolores@agentsea.ai")
thread.post(role="user", msg="Describe the image", images=["data:image/jpeg;base64,..."])
Chat with the MLLM, store the prompt data in the namespace foo
response = router.chat(thread, namespace="foo")
thread.add_msg(response.msg)
Ask for a structured response
from pydantic import BaseModel
class Animal(BaseModel):
    species: str
    color: str
thread.post(
    role="user",
    msg=f"What animal is in this image? Please output as schema {Animal.model_json_schema()}",
    images=["data:image/jpeg;base64,..."]
)
response = router.chat(thread, namespace="animal", expect=Animal)
animal_parsed = response.parsed
assert isinstance(animal_parsed, Animal)
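Presumably, `expect=Animal` validates the model's JSON reply against the Pydantic schema before handing it back as `response.parsed`. A minimal sketch of that validation step, using plain Pydantic v2 (`parse_response` is our own illustrative name):

```python
from pydantic import BaseModel, ValidationError

class Animal(BaseModel):
    species: str
    color: str

def parse_response(raw: str) -> Animal:
    """Validate a raw JSON reply against the expected schema."""
    try:
        return Animal.model_validate_json(raw)
    except ValidationError as err:
        raise ValueError(f"Reply did not match schema: {err}") from err
```

If the reply is missing a required field or is not valid JSON, validation fails loudly rather than returning a partially filled object.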
Find a saved thread or a prompt
RoleThread.find(id="123")
Prompt.find(id="456")
To store a raw OpenAI prompt
from mllm import Prompt, RoleMessage, RoleThread
thread = RoleThread()
msg = {
    "role": "user",
    "content": [
        {
            "type": "text",
            "text": "What's in this image?",
        },
        {
            "type": "image_url",
            "image_url": {"url": "data:image/jpeg;base64,..."},
        },
    ],
}
role_message = RoleMessage.from_openai(msg)
thread.add_msg(role_message)
response = call_openai(thread.to_openai())  # call_openai: your own function that hits the OpenAI API
response_msg = RoleMessage.from_openai(response["choices"][0]["message"])
saved_prompt = Prompt(thread, response_msg, namespace="foo")
Backends
Thread and prompt storage can be backed by:
- SQLite
- PostgreSQL
SQLite is used by default. To use Postgres, set the following environment variables:
DB_TYPE=postgres
DB_NAME=mllm
DB_HOST=localhost
DB_USER=postgres
DB_PASS=abc123
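A library typically turns variables like these into a single database connection URL. As a hedged sketch of how that might look (`database_url` is our own illustrative helper, not mllm's internals; the SQLite filename is assumed):

```python
import os

def database_url() -> str:
    """Build a connection URL from DB_* env vars, falling back to SQLite."""
    if os.environ.get("DB_TYPE") == "postgres":
        user = os.environ["DB_USER"]
        password = os.environ["DB_PASS"]
        host = os.environ["DB_HOST"]
        name = os.environ["DB_NAME"]
        return f"postgresql://{user}:{password}@{host}/{name}"
    return "sqlite:///mllm.db"  # assumed default filename
```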
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
mllm-0.1.31.tar.gz (9.5 kB)
Built Distribution
mllm-0.1.31-py3-none-any.whl (10.7 kB)
File details
Details for the file mllm-0.1.31.tar.gz
File metadata
- Download URL: mllm-0.1.31.tar.gz
- Upload date:
- Size: 9.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.10.1 Darwin/22.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | bb308a6f490e4d6ccd9504e4bfbe1e5326e2d4a5246b5fcb1e3c184b68f46813
MD5 | 6230dee3389d9490f0f3fbf9938d7c14
BLAKE2b-256 | 8a56e831f4048425c1f23961f808e0a77bab8ae4be0a63ba101a8ca4ea76bd28
File details
Details for the file mllm-0.1.31-py3-none-any.whl
File metadata
- Download URL: mllm-0.1.31-py3-none-any.whl
- Upload date:
- Size: 10.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.10.1 Darwin/22.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | cdf9af247a83600829f6997c12f9835456a88d8f35c98c336ec11a8a08732922
MD5 | 4eddd262926c4c4cd150f02fee7dacf6
BLAKE2b-256 | a027e88ce6f8ff88bb8d1008e225087faa3fed9958a9414012dcfbed9ef11063