

MLLM

Multimodal Large Language Models
Explore the docs »

View Demo · Report Bug · Request Feature


Installation

pip install mllm

Usage

Create an MLLM router with a list of preferred models

import os
from mllm import Router

os.environ["OPENAI_API_KEY"] = "..."
os.environ["ANTHROPIC_API_KEY"] = "..."
os.environ["GEMINI_API_KEY"] = "..."

router = Router(
    preference=["gpt-4-turbo", "anthropic/claude-3-opus-20240229", "gemini/gemini-pro-vision"]
)

Create a new role-based chat thread

from mllm import RoleThread

thread = RoleThread(owner_id="dolores@agentsea.ai")
thread.post(role="user", msg="Describe the image", images=["data:image/jpeg;base64,..."])
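
The images parameter takes images as base64 data URLs. A small stdlib-only helper can build one from a local file (image_to_data_url is a hypothetical name, not part of mllm):

import base64

# Hypothetical helper, not part of mllm: encode a local image file as a
# data URL suitable for the images=[...] parameter above.
def image_to_data_url(path: str, mime: str = "image/jpeg") -> str:
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return f"data:{mime};base64,{encoded}"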

Chat with the MLLM, storing the prompt data in the namespace foo

response = router.chat(thread, namespace="foo")
thread.add_msg(response.msg)
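
Since chat is given the whole thread, a follow-up post to the same thread should continue the conversation, reusing only the calls shown above:

thread.post(role="user", msg="What is in the background of the image?")
response = router.chat(thread, namespace="foo")
thread.add_msg(response.msg)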

Ask for a structured response

from pydantic import BaseModel

class Animal(BaseModel):
    species: str
    color: str

thread.post(
    role="user",
    msg=f"What animal is in this image? Please output as schema {Animal.model_json_schema()}"
    images=["data:image/jpeg;base64,..."]
)

response = router.chat(thread, namespace="animal", expect=Animal)
animal_parsed = response.parsed

assert isinstance(animal_parsed, Animal)
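
The parsed result is a regular pydantic Animal instance, so its fields can be used directly:

print(f"The image shows a {animal_parsed.color} {animal_parsed.species}")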

Find a saved thread or a prompt

RoleThread.find(id="123")
Prompt.find(id="456)

To store a raw OpenAI prompt

from mllm import Prompt, RoleMessage, RoleThread

thread = RoleThread()

msg = {
    "role": "user",
    "content": [
        {
            "type": "text",
            "text": "Whats in this image?",
        },
        {
            "type": "image_url",
            "image_url": {"url": f"data:image/jpeg;base64,..."},
        }
    ]
}
role_message = RoleMessage.from_openai(msg)
thread.add_msg(role_message)

# call_openai is a stand-in for your own request function; a sketch follows below
response = call_openai(thread.to_openai())
response_msg = RoleMessage.from_openai(response["choices"][0]["message"])

saved_prompt = Prompt(thread, response_msg, namespace="foo")
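
Note that call_openai above is not provided by mllm. A minimal sketch of what it might look like using the official openai client, dumping the response to a plain dict so the response["choices"][0]["message"] indexing works (the model name is an arbitrary choice):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def call_openai(messages: list) -> dict:
    # Send the thread's OpenAI-format messages to the chat completions
    # endpoint and return the response as a dict for key-style access.
    completion = client.chat.completions.create(model="gpt-4-turbo", messages=messages)
    return completion.model_dump()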

Backends

Thread and prompt storage can be backed by:

  • SQLite
  • PostgreSQL

SQLite is used by default. To use PostgreSQL, set the following environment variables:

DB_TYPE=postgres
DB_NAME=mllm
DB_HOST=localhost
DB_USER=postgres
DB_PASS=abc123
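
The same values can be set from Python, as with the API keys in the Usage section; set them before importing mllm so they are picked up (the values here are placeholders):

import os

os.environ["DB_TYPE"] = "postgres"
os.environ["DB_NAME"] = "mllm"
os.environ["DB_HOST"] = "localhost"
os.environ["DB_USER"] = "postgres"
os.environ["DB_PASS"] = "abc123"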

