
msglm makes it a little easier to create messages for language models like Claude and OpenAI GPTs.

Project description

msglm

Installation

Install the latest version from PyPI:

$ pip install msglm

Usage

To use an LLM we need to structure our messages in a particular format.

Here’s an example of a text chat from the OpenAI docs.

from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
  model="gpt-4o",
  messages=[
    {"role": "user", "content": "What's the Wild Atlantic Way?"}
  ]
)

Generating the correct format for a particular API can get tedious. The goal of msglm is to make it easier.

The examples below will show you how to use msglm for text and image chats with OpenAI and Anthropic.

Text Chats

For a text chat, simply pass a list of strings and the API format (e.g. “openai”) to mk_msgs and it will generate the correct format.

mk_msgs(["Hello, world!", "some assistant response"], api="openai")
[
    {"role": "user", "content": "Hello, world!"},
    {"role": "assistant", "content": "Some assistant response"}
]
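The alternating user/assistant pattern above can be sketched in plain Python. This is a hypothetical illustration of the role-assignment logic, not msglm's actual implementation:

```python
def alternate_roles(prompts):
    """Assign roles by position: even-indexed strings are user turns,
    odd-indexed strings are assistant turns."""
    roles = ("user", "assistant")
    return [{"role": roles[i % 2], "content": p} for i, p in enumerate(prompts)]

msgs = alternate_roles(["Hello, world!", "some assistant response"])
# msgs[0] is a user message, msgs[1] an assistant message
```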

anthropic

from msglm import mk_msgs_anthropic as mk_msgs
from anthropic import Anthropic
client = Anthropic()

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.content[0].text)

openai

from msglm import mk_msgs_openai as mk_msgs
from openai import OpenAI

client = OpenAI()
r = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.choices[0].message.content)

Image Chats

For an image chat, simply pass the raw image bytes in a list with your question to mk_msg and it will generate the correct format.

mk_msg([img, "What's in this image?"], api="anthropic")
{
    "role": "user",
    "content": [
        {"type": "image", "source": {"type": "base64", "media_type": media_type, "data": img}},
        {"type": "text", "text": "What's in this image?"}
    ]
}
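The base64 encoding step mk_msg handles for you can be sketched as follows. This is an illustrative stand-in, not msglm's code; the media type is hardcoded here as an assumption, whereas msglm detects it from the image bytes:

```python
import base64

def image_block(img_bytes, media_type="image/jpeg"):
    """Build an Anthropic-style base64 image content block."""
    return {
        "type": "image",
        "source": {
            "type": "base64",
            "media_type": media_type,
            "data": base64.b64encode(img_bytes).decode("ascii"),
        },
    }
```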

anthropic

import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[mk_msg([img, "Describe the image"])]
)
print(r.content[0].text)

openai

import httpx
from msglm import mk_msg_openai as mk_msg
from openai import OpenAI

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

client = OpenAI()
r = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[mk_msg([img, "Describe the image"])]
)
print(r.choices[0].message.content)

API Wrappers

To make life a little easier, msglm comes with API-specific wrappers for mk_msg and mk_msgs.

For Anthropic use

from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs

For OpenAI use

from msglm import mk_msg_openai as mk_msg, mk_msgs_openai as mk_msgs

Other use-cases

Prompt Caching

msglm supports prompt caching for Anthropic models. Simply pass cache=True to mk_msg or mk_msgs.

from msglm import mk_msg_anthropic as mk_msg

mk_msg("please cache my message", cache=True)

This generates the cache block below:

{
    "role": "user",
    "content": [
        {"type": "text", "text": "Please cache my message", "cache_control": {"type": "ephemeral"}}
    ]
}
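The cache_control annotation above amounts to conditionally tagging the final text block. A minimal sketch of that logic (an illustration, not msglm's implementation):

```python
def text_block(text, cache=False):
    """Build a text content block, optionally marked for ephemeral caching."""
    block = {"type": "text", "text": text}
    if cache:
        block["cache_control"] = {"type": "ephemeral"}
    return block
```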

PDF chats

msglm offers PDF support for Anthropic. Just like an image chat, pass the raw PDF bytes in a list with your question to mk_msg and it will generate the correct format, as shown in the example below.

import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})

url = "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf"
pdf = httpx.get(url).content

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([pdf, "Which model has the highest human preference win rates across each use-case?"])]
)
print(r.content[0].text)

Note: this feature is currently in beta so you’ll need to:

  • use the Anthropic beta client (e.g. anthropic.Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'}))
  • use the claude-3-5-sonnet-20241022 model
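The PDF message mk_msg builds follows the same base64 pattern as images, but as a document block. A sketch under the assumption that Anthropic's PDF content block uses a base64 source with an application/pdf media type:

```python
import base64

def pdf_block(pdf_bytes):
    """Build an Anthropic-style base64 PDF document content block."""
    return {
        "type": "document",
        "source": {
            "type": "base64",
            "media_type": "application/pdf",
            "data": base64.b64encode(pdf_bytes).decode("ascii"),
        },
    }
```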

Citations

msglm supports Anthropic citations. All you need to do is pass the content of your document to mk_ant_doc and then pass the output to mk_msg along with your question as shown in the example below.

from msglm import mk_ant_doc, mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

doc = mk_ant_doc("The grass is green. The sky is blue.", title="My Document")

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([doc, "What color is the grass and sky?"])]
)
for o in r.content:
    if c := getattr(o, 'citations', None):
        print(f"{o.text}. source: {c[0]['cited_text']} from {c[0]['document_title']}")
    else:
        print(o.text)

Note: The citations feature is currently available on Claude 3.5 Sonnet (new) and 3.5 Haiku.
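The document block a helper like mk_ant_doc produces can be sketched as below. This is an assumption about the output shape based on Anthropic's plain-text document format with citations enabled, not msglm's actual code:

```python
def doc_block(content, title=None):
    """Build an Anthropic-style plain-text document block with citations enabled."""
    return {
        "type": "document",
        "source": {"type": "text", "media_type": "text/plain", "data": content},
        "title": title,
        "citations": {"enabled": True},
    }
```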

Summary

We hope msglm makes your life a little easier when chatting with LLMs. To learn more about the package, please read the documentation.

Project details


Download files

Download the file for your platform.

Source Distribution

msglm-0.0.7.tar.gz (12.1 kB)

Uploaded Source

Built Distribution


msglm-0.0.7-py3-none-any.whl (10.8 kB)

Uploaded Python 3

File details

Details for the file msglm-0.0.7.tar.gz.

File metadata

  • Download URL: msglm-0.0.7.tar.gz
  • Upload date:
  • Size: 12.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for msglm-0.0.7.tar.gz

  • SHA256: 65609e384230356ffd72d67a9375fc5177e9dc3e76e1e3babb81fb4599bb00dc
  • MD5: e871f84f73c4925915cda4cc398937d7
  • BLAKE2b-256: 2120dae27ebc9ec172899f5400b0353ea7f541975e09c4009c4701f6f1f97b19


File details

Details for the file msglm-0.0.7-py3-none-any.whl.

File metadata

  • Download URL: msglm-0.0.7-py3-none-any.whl
  • Upload date:
  • Size: 10.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for msglm-0.0.7-py3-none-any.whl

  • SHA256: 48a7594d05080fa634526914b8a3fcbd18b9aebdc2f53dd4a6e5851a73845951
  • MD5: 4d22506072d187cc81a182cbe88c2e25
  • BLAKE2b-256: 3da37c801bc76a4535d905a6025f594a0dd4d48c37160303f92e9bb46583ae9a

