MLLM

Multimodal Large Language Models
Explore the docs »

View Demo · Report Bug · Request Feature


Installation

pip install mllm

Usage

Create an MLLM router with a list of preferred models

import os
from mllm import Router

os.environ["OPENAI_API_KEY"] = "..."
os.environ["ANTHROPIC_API_KEY"] = "..."
os.environ["GEMINI_API_KEY"] = "..."

router = Router(
    preference=["gpt-4-turbo", "anthropic/claude-3-opus-20240229", "gemini/gemini-pro-vision"]
)
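
The same constructor presumably works with a single-entry list if you only want one model (a minimal sketch):

router = Router(preference=["gpt-4-turbo"])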

Create a new role-based chat thread

from mllm import RoleThread

thread = RoleThread(owner_id="dolores@agentsea.ai")
thread.post(role="user", msg="Describe the image", images=["data:image/jpeg;base64,..."])
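
The images parameter takes base64 data URLs; a local file can be encoded with the standard library (a sketch independent of mllm, using an illustrative filename):

import base64

# Read a local image and wrap it in a base64 data URL
with open("cat.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

thread.post(
    role="user",
    msg="Describe the image",
    images=[f"data:image/jpeg;base64,{image_b64}"],
)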

Chat with the MLLM and store the prompt data in the namespace foo

response = router.chat(thread, namespace="foo")
thread.add_msg(response.msg)
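
Since the response is appended to the thread, a follow-up question is just another post and chat (same APIs as above; this sketch assumes images is optional on post):

thread.post(role="user", msg="What else is in the image?")
response = router.chat(thread, namespace="foo")
thread.add_msg(response.msg)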

Ask for a structured response

from pydantic import BaseModel

class Animal(BaseModel):
    species: str
    color: str

thread.post(
    role="user",
    msg=f"What animal is in this image? Please output as schema {Animal.model_json_schema()}",
    images=["data:image/jpeg;base64,..."],
)

response = router.chat(thread, namespace="animal", expect=Animal)
animal_parsed = response.parsed

assert isinstance(animal_parsed, Animal)
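
The parsed result is a regular Pydantic object, so its fields can be used directly:

print(animal_parsed.species, animal_parsed.color)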

Find a saved thread or a prompt

from mllm import Prompt, RoleThread

RoleThread.find(id="123")
Prompt.find(id="456")

To store a raw OpenAI prompt

from mllm import Prompt, RoleMessage, RoleThread

thread = RoleThread()

msg = {
    "role": "user",
    "content": [
        {
            "type": "text",
            "text": "Whats in this image?",
        },
        {
            "type": "image_url",
            "image_url": {"url": f"data:image/jpeg;base64,..."},
        }
    ]
}
role_message = RoleMessage.from_openai(msg)
thread.add_msg(role_message)

# call_openai is a placeholder for however you send OpenAI-format messages to the API
response = call_openai(thread.to_openai())
response_msg = RoleMessage.from_openai(response["choices"][0]["message"])

saved_prompt = Prompt(thread, response_msg, namespace="foo")
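
The stored prompt can later be retrieved with Prompt.find, as shown earlier (a sketch assuming Prompt instances expose an id attribute):

stored = Prompt.find(id=saved_prompt.id)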

Backends

Thread and prompt storage can be backed by:

  • SQLite
  • PostgreSQL

SQLite is used by default. To use Postgres, set the following environment variables:

DB_TYPE=postgres
DB_NAME=mllm
DB_HOST=localhost
DB_USER=postgres
DB_PASS=abc123
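
Equivalently, the variables can be set from Python before mllm opens a database connection (a minimal sketch, assuming mllm reads them from the process environment):

import os

# Example values only; set these before creating any threads or prompts
os.environ["DB_TYPE"] = "postgres"
os.environ["DB_NAME"] = "mllm"
os.environ["DB_HOST"] = "localhost"
os.environ["DB_USER"] = "postgres"
os.environ["DB_PASS"] = "abc123"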

