
msglm makes it a little easier to create messages for language models like Claude and OpenAI GPTs.

Project description

msglm

Installation

Install the latest version from PyPI:

$ pip install msglm

Usage

To use an LLM we need to structure our messages in a particular format.

Here’s an example of a text chat from the OpenAI docs.

from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
  model="gpt-4o",
  messages=[
    {"role": "user", "content": "What's the Wild Atlantic Way?"}
  ]
)

Generating the correct format for a particular API can get tedious. The goal of msglm is to make it easier.

The examples below will show you how to use msglm for text and image chats with OpenAI and Anthropic.

Text Chats

For a text chat, simply pass a list of strings and the API name (e.g. "openai") to mk_msgs and it will generate the correct format.

mk_msgs(["Hello, world!", "some assistant response"], api="openai")
[
    {"role": "user", "content": "Hello, world!"},
    {"role": "assistant", "content": "Some assistant response"}
]

anthropic

from msglm import mk_msgs_anthropic as mk_msgs
from anthropic import Anthropic
client = Anthropic()

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.content[0].text)

openai

from msglm import mk_msgs_openai as mk_msgs
from openai import OpenAI

client = OpenAI()
r = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.choices[0].message.content)

Image Chats

For an image chat, simply pass the raw image bytes in a list with your question to mk_msg and it will generate the correct format.

mk_msg([img, "What's in this image?"], api="anthropic")
{
    "role": "user",
    "content": [
        {"type": "image", "source": {"type": "base64", "media_type": media_type, "data": img}},
        {"type": "text", "text": "What's in this image?"}
    ]
}
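
For comparison, the OpenAI Chat Completions API expects images as base64 data URLs rather than Anthropic-style image blocks. A rough sketch of the shape the same call targets with api="openai" (msglm's exact output may differ in detail; media_type and img_b64 are placeholders):

mk_msg([img, "What's in this image?"], api="openai")
{
    "role": "user",
    "content": [
        {"type": "image_url", "image_url": {"url": f"data:{media_type};base64,{img_b64}"}},
        {"type": "text", "text": "What's in this image?"}
    ]
}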

anthropic

import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[mk_msg([img, "Describe the image"])]
)
print(r.content[0].text)

openai

import httpx
from msglm import mk_msg_openai as mk_msg
from openai import OpenAI

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

client = OpenAI()
r = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[mk_msg([img, "Describe the image"])]
)
print(r.choices[0].message.content)

API Wrappers

To make life a little easier, msglm comes with API-specific wrappers for mk_msg and mk_msgs.

For Anthropic use

from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs

For OpenAI use

from msglm import mk_msg_openai as mk_msg, mk_msgs_openai as mk_msgs
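
With a wrapper imported, you can drop the api argument used in the earlier examples. A minimal sketch of the equivalence (assuming the generic mk_msgs used earlier is also importable from msglm):

from msglm import mk_msgs, mk_msgs_anthropic

# the wrapper is just the generic function with the API pre-selected
assert mk_msgs_anthropic(["Hello, world!"]) == mk_msgs(["Hello, world!"], api="anthropic")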

Other use-cases

Prompt Caching

msglm supports prompt caching for Anthropic models. Simply pass cache=True to mk_msg or mk_msgs.

from msglm import mk_msg_anthropic as mk_msg

mk_msg("please cache my message", cache=True)

This generates the cache control block shown below.

{
    "role": "user",
    "content": [
        {"type": "text", "text": "Please cache my message", "cache_control": {"type": "ephemeral"}}
    ]
}
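
The cached message is sent like any other message; a minimal sketch following the earlier Anthropic examples:

from msglm import mk_msgs_anthropic as mk_msgs
from anthropic import Anthropic

client = Anthropic()

# cache=True only adds the cache_control block shown above; the call itself is unchanged.
# Note that Anthropic only caches prompts above a minimum token length, so a message
# this short won't actually be cached in practice.
r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=mk_msgs(["please cache my message"], cache=True)
)
print(r.content[0].text)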

PDF chats

msglm offers PDF support for Anthropic. Just like an image chat, all you need to do is pass the raw PDF bytes in a list with your question to mk_msg and it will generate the correct format, as shown in the example below.

import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})

url = "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf"
pdf = httpx.get(url).content

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([pdf, "Which model has the highest human preference win rates across each use-case?"])]
)
print(r.content[0].text)

Note: this feature is currently in beta, so you'll need to:

  • use the Anthropic beta client (e.g. anthropic.Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'}))
  • use the claude-3-5-sonnet-20241022 model

Citations

msglm supports Anthropic citations. All you need to do is pass the content of your document to mk_ant_doc and then pass the output to mk_msg along with your question as shown in the example below.

from msglm import mk_ant_doc, mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

doc = mk_ant_doc("The grass is green. The sky is blue.", title="My Document")

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([doc, "What color is the grass and sky?"])]
)
for o in r.content:
    if c := getattr(o, 'citations', None): print(f"{o.text}. source: {c[0]['cited_text']} from {c[0]['document_title']}")
    else: print(o.text)

Note: The citations feature is currently available on Claude 3.5 Sonnet (new) and 3.5 Haiku.

Summary

We hope msglm will make your life a little easier when chatting with LLMs. To learn more about the package, please read this doc.

