# MLLM

MultiModal Large Language Models
## Installation

```sh
pip install mllm
```
## Usage
Create an MLLM router from the API keys found in the current system env vars:

```python
import os

from mllm import MLLMRouter

os.environ["OPENAI_API_KEY"] = "..."
os.environ["ANTHROPIC_API_KEY"] = "..."
os.environ["GEMINI_API_KEY"] = "..."

router = MLLMRouter.from_env()
```
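Presumably, `from_env()` inspects the environment for known provider keys and routes to the providers it finds. A minimal stdlib sketch of that detection pattern, purely illustrative — the mapping and function below are our assumptions, not part of the mllm API:

```python
import os

# Hypothetical mapping from env var to provider name; mllm's actual
# provider list and detection logic may differ.
PROVIDER_KEYS = {
    "OPENAI_API_KEY": "openai",
    "ANTHROPIC_API_KEY": "anthropic",
    "GEMINI_API_KEY": "gemini",
}


def detect_providers(env=os.environ):
    """Return the providers whose API keys are set (non-empty) in the environment."""
    return [name for var, name in PROVIDER_KEYS.items() if env.get(var)]
```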
Create a new role-based chat thread:

```python
from mllm import RoleThread

thread = RoleThread()
thread.post(role="user", msg="How are you?", images=["data:image/jpeg;base64,..."])
```
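Images are passed as base64 data URIs. A small stdlib helper for building one from raw image bytes — the helper name is ours, not part of mllm:

```python
import base64


def to_data_uri(image_bytes: bytes, mime: str = "image/jpeg") -> str:
    """Encode raw image bytes as a base64 data URI suitable for an images list."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{encoded}"
```

For example, `to_data_uri(open("photo.jpg", "rb").read())` yields a string you can drop straight into `images=[...]`.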
Chat with the MLLM, storing the prompt data in the namespace `foo`:

```python
response = router.chat(thread, namespace="foo")
thread.add_msg(response.msg)
```
Ask for a structured response:

```python
from pydantic import BaseModel


class Foo(BaseModel):
    bar: str
    baz: int


thread.post(
    role="user",
    msg=f"What are bar and baz in this image? Please output as schema {Foo.model_json_schema()}",
    images=["data:image/jpeg;base64,..."],
)

response = router.chat(thread, namespace="foo", response_schema=Foo)
foo_parsed = response.parsed

assert isinstance(foo_parsed, Foo)
```
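Conceptually, `response_schema` parsing amounts to validating the model's JSON reply against the schema and returning a typed object. A standalone stdlib sketch of that idea — `FooRecord` and `parse_reply` are illustrative names, not mllm internals:

```python
import json
from dataclasses import dataclass


@dataclass
class FooRecord:
    bar: str
    baz: int


def parse_reply(reply: str) -> FooRecord:
    """Parse a model's JSON reply into a typed object, failing loudly on mismatch."""
    data = json.loads(reply)
    return FooRecord(bar=str(data["bar"]), baz=int(data["baz"]))
```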
Find a saved thread or a prompt:

```python
from mllm import Prompt, RoleThread

RoleThread.find(id="123")
Prompt.find(id="456")
```
Just store prompts:

```python
from mllm import Prompt, RoleMessage, RoleThread

thread = RoleThread()

msg = {
    "role": "user",
    "content": [
        {
            "type": "text",
            "text": "What's in this image?",
        },
        {
            "type": "image_url",
            "image_url": {"url": "data:image/jpeg;base64,..."},
        },
    ],
}
role_message = RoleMessage.from_openai(msg)
thread.add_msg(role_message)

# call_openai is a placeholder for your own call to the OpenAI chat API
response = call_openai(thread.to_openai())
response_msg = RoleMessage.from_openai(response["choices"][0]["message"])

saved_prompt = Prompt(thread, response_msg, namespace="foo")
```
## Download files
### Source Distribution

`mllm-0.1.2.tar.gz` (7.9 kB)

### Built Distribution

`mllm-0.1.2-py3-none-any.whl` (9.3 kB)
## File details

### mllm-0.1.2.tar.gz

- Size: 7.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.10.1 Darwin/22.6.0
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 1eff2eb1c0088822c11a03ffb5c775e65f53fcc8cc9f22c304ea8a2fe3be1502 |
| MD5 | 5d168ca301eef65b0d2c6278fdca1829 |
| BLAKE2b-256 | ee5ba897c6e659428ac124909bcf5cc9d8c4d43e7be6aa648d9ec55248d3b615 |
### mllm-0.1.2-py3-none-any.whl

- Size: 9.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.10.1 Darwin/22.6.0
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 8f3307ee7a93105239716a03bbd8eab7a242f6d408c3209a875e6503c40c19d8 |
| MD5 | 0ef7da734bffe379a477b8ef11bcabfb |
| BLAKE2b-256 | 1dbd371778a7a776b69ffb1b102f1a528887f11c3d5b47ac1add49aa29b34a34 |