# llm-mistral

LLM plugin providing access to Mistral models using the Mistral API

## Installation

Install this plugin in the same environment as LLM:

```bash
llm install llm-mistral
```

## Usage

First, obtain an API key for the Mistral API.

Configure the key using the `llm keys set mistral` command:

```bash
llm keys set mistral
# <paste key here>
```

You can now access the Mistral hosted models. Run `llm models` for a list.

To run a prompt through `mistral-tiny`:

```bash
llm -m mistral-tiny 'A sassy name for a pet sasquatch'
```

To start an interactive chat session with `mistral-small`:

```bash
llm chat -m mistral-small
```
```
Chatting with mistral-small
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> three proud names for a pet walrus
1. "Nanuq," the Inuit word for walrus, which symbolizes strength and resilience.
2. "Sir Tuskalot," a playful and regal name that highlights the walrus' distinctive tusks.
3. "Glacier," a name that reflects the walrus' icy Arctic habitat and majestic presence.
```

To use a system prompt with `mistral-medium` to explain some code:

```bash
cat example.py | llm -m mistral-medium -s 'explain this code'
```

## Vision

The Pixtral models are capable of interpreting images. You can use those like this:

```bash
llm -m pixtral-large 'describe this image' \
  -a https://static.simonwillison.net/static/2025/two-pelicans.jpg
```

Output:

This image features two pelicans in flight against a clear blue sky. Pelicans are large water birds known for their long beaks and distinctive throat pouches, which they use for catching fish. In this photo, the birds are flying close to each other, showcasing their expansive wings and characteristic beaks. The clear sky provides a stark contrast, highlighting the details of their feathers and the graceful curves of their wings. The image captures a moment of synchronicity and elegance in nature.

You can pass filenames instead of URLs.

## Audio

The Voxtral models, voxtral-small and voxtral-mini, are capable of accepting audio input. This currently only works for URLs to MP3 files hosted online:

```bash
llm -m voxtral-small \
  -a https://static.simonwillison.net/static/2024/pelican-joke-request.mp3
```

Output:

What do you call a pelican with no teeth? A gum-ican

## Tools

To see a list of Mistral models that support tools (most of them), run:

```bash
llm models --tools -q mistral
```

Try one out like this:

```bash
llm -m mistral-medium -T llm_time 'What time is it?' --td
```

## Schemas

Mistral models (with the exception of codestral-mamba) also support schemas:

```bash
llm -m mistral-small --schema 'name,bio:one sentence' 'invent a cool dog'
```

Output:

```json
{
  "name": "CyberHound",
  "bio": "A futuristic dog with glowing cybernetic enhancements and the ability to hack into any system."
}
```
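Because `--schema` constrains the model to emit JSON in the requested shape, the response can be parsed and checked directly before use. A minimal stdlib sketch, with the example output above hard-coded as a stand-in for a live response:

```python
import json

# Example response in the shape produced by --schema 'name,bio:one sentence'
raw = """{
  "name": "CyberHound",
  "bio": "A futuristic dog with glowing cybernetic enhancements and the ability to hack into any system."
}"""

def parse_schema_response(text, required=("name", "bio")):
    """Parse a schema-constrained response and verify the expected keys exist."""
    data = json.loads(text)
    missing = [key for key in required if key not in data]
    if missing:
        raise ValueError(f"response missing keys: {missing}")
    return data

dog = parse_schema_response(raw)
print(dog["name"])  # CyberHound
```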

## Model options

All models accept the following options, using `-o name value` syntax:

- `-o temperature 0.7`: The sampling temperature, between 0 and 1. Higher values increase randomness; lower values are more focused and deterministic.
- `-o top_p 0.1`: 0.1 means consider only tokens in the top 10% probability mass. Use this or temperature but not both.
- `-o max_tokens 20`: Maximum number of tokens to generate in the completion.
- `-o safe_mode 1`: Turns on safe mode, which adds a system prompt to add guardrails to the model output.
- `-o random_seed 123`: Set an integer random seed to generate deterministic results.
- `-o prefix 'Prefix here: '`: Set a prefix that will be used for the start of the response. Try `{` to encourage JSON, or `GlaDOS: ` to encourage a roleplay from a specific character.
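For intuition about the `temperature` option: sampling temperature rescales the model's logits before they are turned into probabilities. This is a toy illustration of the idea, not Mistral's actual server-side code; lower values sharpen the distribution, higher values flatten it:

```python
import math

def apply_temperature(logits, temperature):
    """Softmax over logits scaled by 1/temperature."""
    scaled = [logit / temperature for logit in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
cold = apply_temperature(logits, 0.2)  # sharper: the top token dominates
hot = apply_temperature(logits, 1.0)   # flatter: more randomness in sampling
print(cold[0], hot[0])
```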

## Available models

Run `llm models` for a full list of Mistral models. This plugin configures the following alias shortcuts:

- `mistral-tiny` for `mistral/mistral-tiny`
- `mistral-nemo` for `mistral/open-mistral-nemo`
- `mistral-small-2312` for `mistral/mistral-small-2312`
- `mistral-small-2402` for `mistral/mistral-small-2402`
- `mistral-small-2409` for `mistral/mistral-small-2409`
- `mistral-small-2501` for `mistral/mistral-small-2501`
- `magistral-small-2506` for `mistral/magistral-small-2506`
- `magistral-small` for `mistral/magistral-small-latest`
- `mistral-small` for `mistral/mistral-small-latest`
- `mistral-medium-2312` for `mistral/mistral-medium-2312`
- `mistral-medium-2505` for `mistral/mistral-medium-2505`
- `magistral-medium-2506` for `mistral/magistral-medium-2506`
- `magistral-medium` for `mistral/magistral-medium-latest`
- `mistral-medium` for `mistral/mistral-medium-latest`
- `mistral-large` for `mistral/mistral-large-latest`
- `codestral-mamba` for `mistral/codestral-mamba-latest`
- `codestral` for `mistral/codestral-latest`
- `ministral-3b` for `mistral/ministral-3b-latest`
- `ministral-8b` for `mistral/ministral-8b-latest`
- `pixtral-12b` for `mistral/pixtral-12b-latest`
- `pixtral-large` for `mistral/pixtral-large-latest`
- `devstral-small` for `mistral/devstral-small-latest`
- `voxtral-mini` for `mistral/voxtral-mini-2507`
- `voxtral-small` for `mistral/voxtral-small-2507`
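If you need the same alias-to-model mapping in your own code, it is just a dictionary lookup. A small sketch covering a few of the aliases above; `resolve` is a hypothetical helper, not part of the plugin's API:

```python
# A few of the alias shortcuts listed above, mapped to their full model IDs.
ALIASES = {
    "mistral-tiny": "mistral/mistral-tiny",
    "mistral-small": "mistral/mistral-small-latest",
    "mistral-medium": "mistral/mistral-medium-latest",
    "mistral-large": "mistral/mistral-large-latest",
    "codestral": "mistral/codestral-latest",
    "pixtral-large": "mistral/pixtral-large-latest",
}

def resolve(name):
    """Return the full model ID for an alias, or the name unchanged."""
    return ALIASES.get(name, name)

print(resolve("mistral-small"))  # mistral/mistral-small-latest
```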

## Refreshing the model list

Mistral sometimes release new models.

To make those models available to an existing installation of llm-mistral, run this command:

```bash
llm mistral refresh
```

This will fetch and cache the latest list of available models. They should then become available in the output of the `llm models` command.

## Embeddings

The Mistral Embeddings API can be used to generate 1,024-dimensional embeddings for any text.

To embed a single string:

```bash
llm embed -m mistral-embed -c 'this is text'
```

This will return a JSON array of 1,024 floating point numbers.
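Embedding vectors are usually compared with cosine similarity. A stdlib-only sketch of that comparison, using tiny made-up vectors in place of real 1,024-dimensional Mistral embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Stand-ins for embeddings of two related texts and one unrelated text.
v_cat = [0.9, 0.1, 0.2]
v_kitten = [0.85, 0.15, 0.25]
v_finance = [0.1, 0.9, 0.05]

print(cosine_similarity(v_cat, v_kitten))   # close to 1.0
print(cosine_similarity(v_cat, v_finance))  # much lower
```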

Mistral's Codestral Embed is an embedding model that specializes in code. LLM supports it in five different sizes:

```bash
llm embed -m mistral/codestral-embed-256 -c 'code...'
llm embed -m mistral/codestral-embed-512 -c 'code...'
llm embed -m mistral/codestral-embed-1024 -c 'code...'
llm embed -m mistral/codestral-embed-1536 -c 'code...'
llm embed -m mistral/codestral-embed-3072 -c 'code...'
```

The number is the size of the vector that will be returned.

You can also use `codestral-embed`, which is an alias for the default size, `codestral-embed-1536`.

The [LLM documentation](https://llm.datasette.io/en/stable/embeddings/index.html) has more, including how to embed in bulk and store the results in a SQLite database.

See [LLM now provides tools for working with embeddings](https://simonwillison.net/2023/Sep/4/llm-embeddings/) and [Embeddings: What they are and why they matter](https://simonwillison.net/2023/Oct/23/embeddings/) for more about embeddings.

## Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-mistral
python3 -m venv venv
source venv/bin/activate
```

Now install the dependencies and test dependencies:

```bash
llm install -e '.[test]'
```

To run the tests:

```bash
pytest
```

## Project details

The following files were published for release 0.15 via Trusted Publishing (publisher: `publish.yml` on simonw/llm-mistral, uploaded with twine/6.1.0 on CPython 3.12.9).

Source distribution: `llm_mistral-0.15.tar.gz` (15.2 kB)

- SHA256: `dcdb5c67fbf45f3531bdbb8b5143c890a519626b7a05ee27a539e8bf71a6c066`
- MD5: `2b8c71589bc59d4f2cd41d0db60e4d27`
- BLAKE2b-256: `da328bc79ef5ee6fab99fc96c80592c7c4519819964f21d79bfc8dadb7abd9f2`

Built distribution: `llm_mistral-0.15-py3-none-any.whl` (13.8 kB, Python 3)

- SHA256: `3dfd15db761c040da49fa1675fc2581be95f16ca7757601a8ca3a245896ba5b4`
- MD5: `06a8c30d20a3c59b6c144899b26b998c`
- BLAKE2b-256: `458e58c3b8bdf1539d8689c8dd5a181b7dbbb78a9d62d6aa0e8bc959d59d0a24`
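To check a downloaded release file against its published SHA256 hash, you can stream it through Python's `hashlib`. A minimal sketch, demonstrated on a throwaway temporary file rather than the real archive:

```python
import hashlib
import os
import tempfile

def sha256_of_file(path, chunk_size=8192):
    """Stream a file through SHA256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demonstrate on a stand-in file; for a real check, pass the downloaded
# archive's path and compare against the hash published for that file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
    tmp_path = f.name
digest = sha256_of_file(tmp_path)
os.unlink(tmp_path)
print(digest)
```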
