
MLLM

MultiModal Large Language Models

Installation

pip install mllm

Usage

Create an MLLM router from the API keys found in the current system env vars

import os
from mllm import MLLMRouter

os.environ["OPENAI_API_KEY"] = "..."
os.environ["ANTHROPIC_API_KEY"] = "..."
os.environ["GEMINI_API_KEY"] = "..."

router = MLLMRouter.from_env()

Create a new role-based chat thread

from mllm import RoleThread

thread = RoleThread()
thread.post(role="user", msg="How are you?", images=["data:image/jpeg;base64,..."])
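
The images argument takes data URLs; the base64 payload is elided in the examples. If you want to build one from a local file, a minimal standard-library sketch (the file name here is hypothetical):

import base64

# Read a local JPEG and wrap it in the data URL format the examples expect
with open("screenshot.jpg", "rb") as f:
    b64 = base64.b64encode(f.read()).decode("utf-8")

image_url = f"data:image/jpeg;base64,{b64}"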

Chat with the MLLM and store the prompt data in the namespace foo

response = router.chat(thread, namespace="foo")
thread.add_msg(response.msg)

Ask for a structured response

from pydantic import BaseModel

class Foo(BaseModel):
    bar: str
    baz: int

thread.post(
    role="user",
    msg=f"What are bar and baz in this image? Please output as schema {Foo.model_json_schema()}",
    images=["data:image/jpeg;base64,..."],
)

response = router.chat(thread, namespace="foo", response_schema=Foo)
foo_parsed = response.parsed

assert isinstance(foo_parsed, Foo)
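
Because response_schema=Foo was supplied and the assert above holds, foo_parsed is an ordinary pydantic instance, so its fields are regular typed attributes:

print(foo_parsed.bar)  # str
print(foo_parsed.baz)  # int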

Find a saved thread or a prompt

RoleThread.find(id="123")
Prompt.find(id="456)

Just store prompts

from mllm import Prompt, RoleMessage, RoleThread

thread = RoleThread()

msg = {
    "role": "user",
    "content": [
        {
            "type": "text",
            "text": "Whats in this image?",
        },
        {
            "type": "image_url",
            "image_url": {"url": f"data:image/jpeg;base64,..."},
        }
    ]
}
role_message = RoleMessage.from_openai(msg)
thread.add_msg(role_message)

# call_openai is a placeholder for your own function that sends the request
response = call_openai(thread.to_openai())
response_msg = RoleMessage.from_openai(response["choices"][0]["message"])

saved_prompt = Prompt(thread, response_msg, namespace="foo")
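
The call_openai helper above is left undefined; here is a minimal sketch using the official openai client (the model name and the dict conversion are assumptions, not part of mllm):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def call_openai(messages: list) -> dict:
    # thread.to_openai() yields OpenAI-format chat messages
    completion = client.chat.completions.create(model="gpt-4o", messages=messages)
    # model_dump() returns the standard {"choices": [{"message": ...}]} dict shape
    return completion.model_dump()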

Download files

Source Distribution: mllm-0.1.2.tar.gz (7.9 kB)

Built Distribution: mllm-0.1.2-py3-none-any.whl (9.3 kB)
