llm-mistral
LLM plugin providing access to Mistral models using the Mistral API
Installation
Install this plugin in the same environment as LLM:
llm install llm-mistral
Usage
First, obtain an API key for the Mistral API.
Configure the key using the llm keys set mistral command:
llm keys set mistral
<paste key here>
You can now access the three Mistral-hosted models: mistral-tiny, mistral-small and mistral-medium.
To run a prompt through mistral-tiny:
llm -m mistral-tiny 'A sassy name for a pet sasquatch'
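The same prompt can also be run from Python. This is a minimal sketch assuming LLM's Python API (llm.get_model() and model.prompt()); it needs the llm package, this plugin, and a configured Mistral key to actually execute:

```python
def run_prompt(prompt, model_id="mistral-tiny"):
    # Deferred import so this sketch loads even without the llm package;
    # running it requires `llm install llm-mistral` and a Mistral API key.
    import llm

    model = llm.get_model(model_id)  # look up the plugin-registered model by ID
    return model.prompt(prompt).text()
```

For example, run_prompt("A sassy name for a pet sasquatch") mirrors the CLI invocation above.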
To start an interactive chat session with mistral-small:
llm chat -m mistral-small
Chatting with mistral-small
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> three proud names for a pet walrus
1. "Nanuq," the Inuit word for walrus, which symbolizes strength and resilience.
2. "Sir Tuskalot," a playful and regal name that highlights the walrus' distinctive tusks.
3. "Glacier," a name that reflects the walrus' icy Arctic habitat and majestic presence.
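Multi-turn chat is also available from Python. The following is a hedged sketch assuming LLM's conversation API (model.conversation()), which keeps earlier turns as context for later prompts, much like llm chat does interactively:

```python
def chat(prompts, model_id="mistral-small"):
    # Deferred import: executing this needs the llm package plus this plugin.
    import llm

    model = llm.get_model(model_id)
    # A conversation object carries prior exchanges as context,
    # so each prompt sees the turns that came before it.
    conversation = model.conversation()
    return [conversation.prompt(p).text() for p in prompts]
```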
To use a system prompt with mistral-medium to explain some code:
cat example.py | llm -m mistral-medium -s 'explain this code'
Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
cd llm-mistral
python3 -m venv venv
source venv/bin/activate
Now install the dependencies and test dependencies:
llm install -e '.[test]'
To run the tests:
pytest
Hashes for llm_mistral-0.1-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 92d9dd7f2636daa39643594c95d0b82fc336024d5491eacfb35b31c2db744324
MD5 | ccf459bdfc2afa264af561cf0c9caaf9
BLAKE2b-256 | 680f24244887eb8e77d3c4b8e7c786f7559dc9a0016c598d2b43dba39ba08dbb