
Project description

llm-mistral


LLM plugin providing access to Mistral models using the Mistral API

Installation

Install this plugin in the same environment as LLM:

llm install llm-mistral
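
To confirm the plugin is installed, you can list your plugins using LLM core's llm plugins command (exact output varies by LLM version):

llm plugins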

Usage

First, obtain an API key for the Mistral API.

Configure the key using the llm keys set mistral command:

llm keys set mistral
<paste key here>
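
Alternatively, LLM supports passing a key for a single invocation with the --key option; a quick sketch (the key value is a placeholder):

llm -m mistral-tiny 'Say hello' --key <your-api-key>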

You can now access the Mistral hosted models. Run llm models for a list.
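
On a Unix-like shell you can filter that list down to just the Mistral entries; this is plain shell plumbing, not a plugin feature:

llm models | grep -i mistral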

To run a prompt through mistral-tiny:

llm -m mistral-tiny 'A sassy name for a pet sasquatch'

To start an interactive chat session with mistral-small:

llm chat -m mistral-small
Chatting with mistral-small
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> three proud names for a pet walrus
1. "Nanuq," the Inuit word for walrus, which symbolizes strength and resilience.
2. "Sir Tuskalot," a playful and regal name that highlights the walrus' distinctive tusks.
3. "Glacier," a name that reflects the walrus' icy Arctic habitat and majestic presence.

To use a system prompt with mistral-medium to explain some code:

cat example.py | llm -m mistral-medium -s 'explain this code'
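
You can follow up on the most recent response using LLM core's -c/--continue flag, for example:

cat example.py | llm -m mistral-medium -s 'explain this code'
llm -c 'now suggest one improvement to it'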

Vision

The pixtral-12b model is capable of interpreting images. You can use it like this:

llm -m pixtral-12b 'describe this image' -a https://static.simonwillison.net/static/2024/earth.jpg

You can also pass filenames instead of URLs.
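
For example, with a local file (image.jpg is a placeholder path):

llm -m pixtral-12b 'describe this image' -a image.jpg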

Model options

All of these models accept the following options, using -o name value syntax (a combined example follows the list):

  • -o temperature 0.7: The sampling temperature, between 0 and 1. Higher values increase randomness; lower values are more focused and deterministic.
  • -o top_p 0.1: 0.1 means consider only the tokens in the top 10% of the probability mass. Use this or temperature, but not both.
  • -o max_tokens 20: Maximum number of tokens to generate in the completion.
  • -o safe_mode 1: Turns on safe mode, which injects a system prompt that adds guardrails to the model output.
  • -o random_seed 123: Set an integer random seed to generate deterministic results.
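
Options can be combined in a single call; a sketch (the prompt and values are illustrative):

llm -m mistral-small -o temperature 0.2 -o max_tokens 50 'Name three whale species'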

Available models

Run llm models for a full list of Mistral models. This plugin configures the following alias shortcuts:

  • mistral-tiny for mistral/mistral-tiny
  • mistral-nemo for mistral/open-mistral-nemo
  • mistral-small for mistral/mistral-small
  • mistral-medium for mistral/mistral-medium
  • mistral-large for mistral/mistral-large-latest
  • codestral-mamba for mistral/codestral-mamba-latest
  • codestral for mistral/codestral-latest
  • ministral-3b for mistral/ministral-3b-latest
  • ministral-8b for mistral/ministral-8b-latest
  • pixtral-12b for mistral/pixtral-12b-latest
  • pixtral-large for mistral/pixtral-large-latest
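
The aliases and the full model IDs are interchangeable, so these two commands are equivalent:

llm -m mistral-tiny 'A sassy name for a pet sasquatch'
llm -m mistral/mistral-tiny 'A sassy name for a pet sasquatch'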

Refreshing the model list

Mistral sometimes releases new models.

To make those models available to an existing installation of llm-mistral, run this command:

llm mistral refresh

This will fetch and cache the latest list of models, which should then show up in the output of the llm models command.

Embeddings

The Mistral Embeddings API can be used to generate 1,024-dimensional embeddings for any text.

To embed a single string:

llm embed -m mistral-embed -c 'this is text'

This will return a JSON array of 1,024 floating point numbers.
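
One way to sanity-check the dimension count, assuming you have jq installed:

llm embed -m mistral-embed -c 'this is text' | jq length

This should print 1024.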

The LLM documentation has more, including how to embed in bulk and store the results in a SQLite database.
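
A sketch of that bulk workflow using LLM core's embed-multi and similar commands (the collection name notes, the directory notes/, and the database file notes.db are all placeholder names):

llm embed-multi notes -m mistral-embed -d notes.db --files notes '*.txt'
llm similar notes -d notes.db -c 'walrus facts'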

See "LLM now provides tools for working with embeddings" and "Embeddings: What they are and why they matter" for more about embeddings.

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-mistral
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

pytest
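
To run a subset of the tests, pytest's standard -k selector works as usual (the expression embed is a placeholder):

pytest -k embed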

Download files

Download the file for your platform.

Source Distribution

llm_mistral-0.9a0.tar.gz (11.9 kB)


Built Distribution

llm_mistral-0.9a0-py3-none-any.whl (11.1 kB)


File details

Details for the file llm_mistral-0.9a0.tar.gz.

File metadata

  • Download URL: llm_mistral-0.9a0.tar.gz
  • Upload date:
  • Size: 11.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for llm_mistral-0.9a0.tar.gz:

  • SHA256: ffd7889528b68c3ef737c5ae73792c3d3f7f3d3cc314015e37465fa688e93514
  • MD5: a543c88e2875ca4010533792ea2c78c1
  • BLAKE2b-256: a32ea7d853b00b34dd6928e246b6d66700f4db21bb4f6d4db6b89206070b0089


Provenance

The following attestation bundles were made for llm_mistral-0.9a0.tar.gz:

Publisher: publish.yml on simonw/llm-mistral

File details

Details for the file llm_mistral-0.9a0-py3-none-any.whl.

File metadata

  • Download URL: llm_mistral-0.9a0-py3-none-any.whl
  • Upload date:
  • Size: 11.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for llm_mistral-0.9a0-py3-none-any.whl:

  • SHA256: e0615279897d9a4245316260faf30d0c58f31f91bc4ef4c3157b3fd73755ae79
  • MD5: 72623af8e2f2e36a34f519e4643c11eb
  • BLAKE2b-256: f066deb145a3364517a3cb8a5a5b598a161242c6fa80062ed7d9543c4a0cf8a0


Provenance

The following attestation bundles were made for llm_mistral-0.9a0-py3-none-any.whl:

Publisher: publish.yml on simonw/llm-mistral
