
A helper for using the Mistral AI API


mistinguette

Install

pip install mistinguette

Getting started

Mistral AI's Python SDK will automatically be installed with Mistinguette, if you don't already have it.

from mistinguette import *

Mistinguette only exports the symbols that are needed to use the library, so you can use import * to import them. Alternatively, just use:

import mistinguette

…and then add the prefix mistinguette. to any usages of the module.

models
['codestral-2501',
 'mistral-large-2411',
 'pixtral-large-2411',
 'mistral-saba-2502',
 'ministral-3b-2410',
 'ministral-8b-2410',
 'mistral-embed-2312',
 'mistral-moderation-2411',
 'mistral-ocr-2503',
 'mistral-small-2503',
 'open-mistral-nemo-2407']

For these examples, we’ll use mistral-large-2411.

model = models[1]

Chat

The main interface to Mistinguette is the Chat class, which provides a stateful interface to the models:

chat = Chat(model, sp="""You are a helpful and concise assistant.""")
chat("I'm Jeremy")

Hello Jeremy! Nice to meet you. How can I assist you today?

  • id: 401b34940c76413b93fde428839d2f28
  • object: chat.completion
  • model: mistral-large-2411
  • usage: prompt_tokens=18 completion_tokens=16 total_tokens=34
  • created: 1743589362
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='Hello Jeremy! Nice to meet you. How can I assist you today?', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]
r = chat("What's my name?")
r

Jeremy

  • id: 851227e58cbe4e6b97751897578ec3e5
  • object: chat.completion
  • model: mistral-large-2411
  • usage: prompt_tokens=50 completion_tokens=2 total_tokens=52
  • created: 1743589366
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='Jeremy', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]

As you see above, displaying the results of a call in a notebook shows just the message contents, with the other details hidden behind a collapsible section. Alternatively you can print the details:

print(r)
id='851227e58cbe4e6b97751897578ec3e5' object='chat.completion' model='mistral-large-2411' usage=In: 50; Out: 2; Total: 52 created=1743589366 choices=[ChatCompletionChoice(index=0, message=AssistantMessage(content='Jeremy', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]

You can use stream=True to stream the results as soon as they arrive (although you will only see the gradual generation if you execute the notebook yourself, of course!)

for o in chat("What's your name?", stream=True): print(o, end='')
I don't have a name. I'm here to assist you. Is there something specific you would like help with?
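If you want to keep the full reply as well as display it, you can join the chunks as they arrive. A minimal sketch, written over any iterable of text pieces (with Mistinguette you would pass chat(prompt, stream=True) rather than the scripted example below):

```python
def stream_and_collect(chunks):
    "Print chunks as they arrive, and return the full joined text."
    parts = []
    for o in chunks:
        print(o, end='')
        parts.append(str(o))
    return ''.join(parts)

# Scripted stand-in for a streamed response, just to make the sketch runnable.
stream_and_collect(iter(["I don't have a name. ", "How can I help?"]))
```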

Model Capabilities

Different Mistral AI models have different capabilities. For instance, mistral-large-2411 cannot take an image as input, whereas pixtral-large-2411 can:

# mistral-large-2411 supports streaming, system prompts, and temperature, but not images
m = "mistral-large-2411"
can_stream(m), can_set_system_prompt(m), can_set_temperature(m), can_use_image(m)
(True, True, True, False)
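These helpers make it easy to pick a model programmatically. A hedged sketch (the capability table below is illustrative, hard-coded from the checks above rather than queried from the library):

```python
# Illustrative capability table, hard-coded from the checks above.
caps = {
    'mistral-large-2411': dict(stream=True, image=False),
    'pixtral-large-2411': dict(stream=True, image=True),
}

def models_with(capability):
    "Return the models in `caps` that support the given capability."
    return [m for m, c in caps.items() if c.get(capability)]

models_with('image')  # ['pixtral-large-2411']
```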

Tool use

Tool use lets the model use external tools.

We use docments to make defining Python functions as ergonomic as possible. Each parameter (and the return value) should have a type, and a docments comment with the description of what it is. As an example we’ll write a simple function that adds numbers together, and will tell us when it’s being called:

def sums(
    a:int,  # First thing to sum
    b:int=1 # Second thing to sum
) -> int: # The sum of the inputs
    "Adds a + b."
    print(f"Finding the sum of {a} and {b}")
    return a + b
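Under the hood, a tool call from the model arrives as a function name plus JSON-encoded arguments. A minimal sketch of the dispatch step (the helper name call_tool is ours, for illustration, not part of the library):

```python
import json

def sums(a: int, b: int = 1) -> int:
    "Adds a + b."
    return a + b

def call_tool(tools, name, arguments):
    "Look up `name` among `tools` and call it with JSON-decoded arguments."
    fns = {f.__name__: f for f in tools}
    return fns[name](**json.loads(arguments))

call_tool([sums], 'sums', '{"a": 604542, "b": 6458932}')  # 7063474
```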

Sometimes the model will say something like “according to the sums tool the answer is” – generally we’d rather it just tells the user the answer, so we can use a system prompt to help with this:

sp = "Never mention what tools you use."

We’ll get the model to add up some long numbers:

model
'mistral-large-2411'
a,b = 604542,6458932
pr = f"What is {a}+{b}?"
pr
'What is 604542+6458932?'

To use tools, pass a list of them to Chat:

chat = Chat(model, sp=sp, tools=[sums])

Now when we call that with our prompt, the model doesn’t return the answer, but instead returns a tool_use message, which means we have to call the named tool with the provided parameters:

r = chat(pr)
r
Finding the sum of 604542 and 6458932
  • id: add10a17e8494845a76595e45d765ba7
  • object: chat.completion
  • model: mistral-large-2411
  • usage: prompt_tokens=132 completion_tokens=37 total_tokens=169
  • created: 1743589453
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='', tool_calls=[ToolCall(function=FunctionCall(name='sums', arguments='{"a": 604542, "b": 6458932}'), id='TLVUDh6Me', type=None, index=0)], prefix=False, role='assistant'), finish_reason='tool_calls')]

Mistinguette handles all that for us – we just call the chat again with no prompt, and the tool result is passed back to the model automatically:

chat()

What is 604542+6458932? The answer is 7,063,474.

  • id: d4c05d5724fa433cb9020371edc15f97
  • object: chat.completion
  • model: mistral-large-2411
  • usage: prompt_tokens=197 completion_tokens=33 total_tokens=230
  • created: 1743589467
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='What is 604542+6458932? The answer is 7,063,474.', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]

You can see how many tokens have been used at any time by checking the use property.

chat.use
In: 329; Out: 70; Total: 399

Tool loop

We can do everything needed to use tools in a single step, by using Chat.toolloop. This can even call multiple tools as needed to solve a problem. For example, let's define a tool to handle multiplication:

def mults(
    a:int,  # First thing to multiply
    b:int=1 # Second thing to multiply
) -> int: # The product of the inputs
    "Multiplies a * b."
    print(f"Finding the product of {a} and {b}")
    return a * b

Now with a single call we can calculate (a+b)*2 – by passing trace_func we can see each response from the model in the process:

chat = Chat(model, sp=sp, tools=[sums,mults])
pr = f'Calculate ({a}+{b})*2'
pr
'Calculate (604542+6458932)*2'
def pchoice(r): print(r.choices[0])
r = chat.toolloop(pr, trace_func=pchoice)
Finding the sum of 604542 and 6458932
Finding the product of 2 and 7103474
Choice(finish_reason='tool_calls', index=0, logprobs=None, message=ChatCompletionMessage(content=None, refusal=None, role='assistant', audio=None, function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_Sfet73hgfRtSI2N25K97D7tO', function=Function(arguments='{"a": 604542, "b": 6458932}', name='sums'), type='function'), ChatCompletionMessageToolCall(id='call_mFQNJgjATAI2pYFFQyvfg0W2', function=Function(arguments='{"a": 2, "b": 7103474}', name='mults'), type='function')]))
Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='The result of \\((604542 + 6458932) \\times 2\\) is 14,206,948.', refusal=None, role='assistant', audio=None, function_call=None, tool_calls=None))
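The loop itself is conceptually simple: call the model, execute any requested tools, feed the results back, and repeat until the model answers in plain text. A runnable sketch with a scripted stand-in for the model (everything below is illustrative, not the library's internals):

```python
import json

def sums(a: int, b: int = 1) -> int: return a + b
def mults(a: int, b: int = 1) -> int: return a * b

def tool_loop(model_fn, tools, prompt):
    "Run `model_fn` repeatedly until it returns plain content, not a tool call."
    fns = {f.__name__: f for f in tools}
    msgs = [prompt]
    while True:
        r = model_fn(msgs)
        if 'tool' not in r: return r['content']
        # Execute the requested tool and append its result to the history.
        msgs.append(str(fns[r['tool']](**json.loads(r['args']))))

# A scripted fake model: first asks for the sum, then the product, then answers.
script = iter([
    {'tool': 'sums',  'args': '{"a": 2, "b": 3}'},
    {'tool': 'mults', 'args': '{"a": 5, "b": 2}'},
    {'content': 'The answer is 10'},
])
tool_loop(lambda msgs: next(script), [sums, mults], 'Calculate (2+3)*2')
```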

The model uses special tags for math equations, which we can replace using wrap_latex:

wrap_latex(contents(r))

The result of $(604542 + 6458932) \times 2$ is 14,206,948.
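For the curious, here is a sketch of the kind of substitution such a conversion performs – our approximation for illustration, not wrap_latex's actual implementation: turning \( ... \) and \[ ... \] delimiters into the $-style that notebooks render.

```python
import re

def wrap_latex_sketch(s):
    "Convert \\( ... \\) and \\[ ... \\] math delimiters to $-style."
    s = re.sub(r'\\\((.*?)\\\)', r'$\1$', s)
    return re.sub(r'\\\[(.*?)\\\]', r'$$\1$$', s, flags=re.S)

wrap_latex_sketch(r'The result of \((604542 + 6458932) \times 2\) is 14,206,948.')
```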

Images

As everyone knows, when testing image APIs you have to use a cute puppy.

fn = Path('samples/puppy.jpg')
display.Image(filename=fn, width=200)

We create a Chat object as before:

model = "pixtral-large-2411"
chat = Chat(model)

Mistinguette expects images as a list of bytes, so we read in the file:

img = fn.read_bytes()
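Chat APIs generally transmit images as base64-encoded data, and Mistinguette takes care of that from the raw bytes. A hedged sketch of the typical encoding step (the exact wire format is the library's business):

```python
import base64

def to_data_url(img_bytes, mime='image/jpeg'):
    "Encode raw image bytes as a base64 data URL."
    return f"data:{mime};base64,{base64.b64encode(img_bytes).decode()}"

to_data_url(b'\xff\xd8\xff')  # 'data:image/jpeg;base64,/9j/'
```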

Prompts to Mistinguette can be lists, containing text, images, or both, e.g.:

chat([img, "In brief, what color flowers are in this image?"])

The flowers in the image are purple. These purple flowers seem to be daisy-like, and they are growing near the puppy, which is lying on the grass.

  • id: 110860448b5f443588a5288fbe72a807
  • object: chat.completion
  • model: pixtral-large-2411
  • usage: prompt_tokens=274 completion_tokens=36 total_tokens=310
  • created: 1743589622
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='The flowers in the image are purple. These purple flowers seem to be daisy-like, and they are growing near the puppy, which is lying on the grass.', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]

The image is included as input tokens.

chat.use
In: 274; Out: 36; Total: 310

Alternatively, Mistinguette supports creating a multi-stage chat with separate image and text prompts. For instance, you can pass just the image as the initial prompt (in which case the model will make some general comments about what it sees), and then follow up with questions in additional prompts:

chat = Chat(model)
chat(img)

This is an adorable puppy! It looks like a Cavalier King Charles Spaniel, known for its friendly and affectionate nature. These dogs are great companions and enjoy being around people. They have a gentle temperament and are well-suited to indoor living. If you have any specific questions about this breed or puppies in general, feel free to ask!

  • id: 47c88a7e27004d3c832a5dc8277f30a4
  • object: chat.completion
  • model: pixtral-large-2411
  • usage: prompt_tokens=263 completion_tokens=76 total_tokens=339
  • created: 1743589646
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='This is an adorable puppy! It looks like a Cavalier King Charles Spaniel, known for its friendly and affectionate nature. These dogs are great companions and enjoy being around people. They have a gentle temperament and are well-suited to indoor living. If you have any specific questions about this breed or puppies in general, feel free to ask!', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]
chat('What direction is the puppy facing?')

The puppy is facing the camera, looking directly at it. This means it is facing forward from the perspective of the viewer.

  • id: 118c686b1af64f42b04ff506a7e3f694
  • object: chat.completion
  • model: pixtral-large-2411
  • usage: prompt_tokens=350 completion_tokens=27 total_tokens=377
  • created: 1743589654
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='The puppy is facing the camera, looking directly at it. This means it is facing forward from the perspective of the viewer.', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]
chat('What color is it?')

The puppy is a Cavalier King Charles Spaniel with a Blenheim coloring. This means it has a white coat with chestnut or reddish-brown markings, particularly on the ears and parts of the face. The spot on the top of its head is a characteristic feature of the Blenheim color pattern.

  • id: 9e7c3903c05740a2845e8dca2307428c
  • object: chat.completion
  • model: pixtral-large-2411
  • usage: prompt_tokens=385 completion_tokens=69 total_tokens=454
  • created: 1743589657
  • choices: [ChatCompletionChoice(index=0, message=AssistantMessage(content='The puppy is a Cavalier King Charles Spaniel with a Blenheim coloring. This means it has a white coat with chestnut or reddish-brown markings, particularly on the ears and parts of the face. The spot on the top of its head is a characteristic feature of the Blenheim color pattern.', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]

Note that the image is passed in again for every input in the dialog, so the number of input tokens increases quickly with this kind of chat.

chat.use
In: 998; Out: 172; Total: 1170
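The total above is just the per-call usage summed: each turn resends the full history, image included, so prompt tokens climb turn by turn. Checking against the three responses above:

```python
# Per-call usage from the three image-chat responses above.
prompt_tokens = [263, 350, 385]
completion_tokens = [76, 27, 69]
sum(prompt_tokens), sum(completion_tokens)  # (998, 172)
```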
