
LlamaIndex Llms Integration: Mistral

Installation

Install the required packages using the following commands:

%pip install llama-index-llms-mistralai
%pip install llama-index
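The commands above use notebook-style pip invocations; from a regular terminal the equivalent installs are simply:

```shell
pip install llama-index-llms-mistralai
pip install llama-index
```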

Basic Usage

Initialize the MistralAI Model

To use the MistralAI model, create an instance and provide your API key:

from llama_index.llms.mistralai import MistralAI

llm = MistralAI(api_key="<replace-with-your-key>")

Generate Completions

To generate a text completion for a prompt, use the complete method:

resp = llm.complete("Paul Graham is ")
print(resp)
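Printing the response works because the returned object stringifies to its completion text. As an illustration only (this is a hypothetical minimal sketch, not the real llama_index class), the shape is roughly:

```python
from dataclasses import dataclass


@dataclass
class CompletionResponseSketch:
    """Hypothetical stand-in for the object complete() returns."""

    text: str  # the raw completion string

    def __str__(self) -> str:
        # print(resp) delegates to the text attribute
        return self.text


resp = CompletionResponseSketch(text="a programmer, essayist, and investor")
print(resp)       # same as print(resp.text)
print(resp.text)  # the raw completion string
```

In practice you can use either `print(resp)` or `resp.text` to get at the generated string.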

Chat with the Model

You can also chat with the model using a list of messages. Here’s an example:

from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.chat(messages)
print(resp)
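Each ChatMessage pairs a role with content. On the wire, chat APIs like Mistral's typically receive this as a list of role/content dicts; a dependency-free sketch of that shape (an assumption about the wire format, shown without calling the API):

```python
# Hedged sketch: the ChatMessage list above corresponds to a payload of
# {"role": ..., "content": ...} dicts, system message first.
messages = [
    {"role": "system", "content": "You are CEO of MistralAI."},
    {"role": "user", "content": "Tell me the story about La plateforme"},
]

roles = [m["role"] for m in messages]
print(roles)  # ['system', 'user']
```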

Using Random Seed

To set a random seed for reproducibility, initialize the model with the random_seed parameter:

resp = MistralAI(api_key="<replace-with-your-key>", random_seed=42).chat(messages)
print(resp)
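The guarantee is the same one a seeded local PRNG gives: identical seed, identical sampling path. A stdlib analogy (illustration only, not the model's actual sampler):

```python
import random

# Two generators seeded identically produce identical sequences,
# which is the reproducibility random_seed aims to give model sampling.
rng1 = random.Random(42)
rng2 = random.Random(42)

a = [rng1.randint(0, 100) for _ in range(3)]
b = [rng2.randint(0, 100) for _ in range(3)]
print(a == b)  # True
```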

Streaming Responses

Stream Completions

You can stream responses using the stream_complete method:

resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")

Stream Chat Responses

To stream chat messages, use the following code:

messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")

Configure Model

To use a specific model configuration, initialize the model with the desired model name:

llm = MistralAI(model="mistral-medium")
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")

Mistral Azure SDK Usage

To use the Mistral Azure SDK implementation, pass the Azure endpoint and API key. When these are provided, the client automatically uses the Mistral Azure SDK instead of the public Mistral endpoint.

from llama_index.llms.mistralai import MistralAI

llm = MistralAI(
    azure_endpoint="https://<your-resource-name>.openai.azure.com",
    azure_api_key="<replace-with-your-azure-key>",
    model="mistral-large-latest",
)

resp = llm.complete("Paul Graham is ")
print(resp)

Function Calling

The model can call functions you expose as tools. Here's an example:

from llama_index.llms.mistralai import MistralAI
from llama_index.core.tools import FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


def mystery(a: int, b: int) -> int:
    """Mystery function on two integers."""
    return a * b + a + b


mystery_tool = FunctionTool.from_defaults(fn=mystery)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = MistralAI(model="mistral-large-latest")
response = llm.predict_and_call(
    [mystery_tool, multiply_tool],
    user_msg="What happens if I run the mystery function on 5 and 7?",
)
print(str(response))
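You can sanity-check what the tool call should produce by running the function locally: mystery(5, 7) = 5 * 7 + 5 + 7 = 47, so the model, if it routes the request to mystery_tool correctly, should report 47.

```python
def mystery(a: int, b: int) -> int:
    """Mystery function on two integers (same definition as above)."""
    return a * b + a + b


# Expected tool result for the prompt "on 5 and 7":
print(mystery(5, 7))  # 5*7 + 5 + 7 = 47
```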

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/mistralai/

Download files

Source distribution: llama_index_llms_mistralai-0.10.1.tar.gz (10.0 kB)
Built distribution: llama_index_llms_mistralai-0.10.1-py3-none-any.whl (10.1 kB, Python 3)

File details: llama_index_llms_mistralai-0.10.1.tar.gz

  • Size: 10.0 kB
  • Tags: Source
  • Uploaded via: uv/0.11.4
  • SHA256: 782c71b989eb5d8a1419649df176cce7634a155887dc7744311b6ca70868e2d9
  • MD5: dde6118e260c3d3b1c592ce8120496d2
  • BLAKE2b-256: 5236c6d19b67567aa2355cfe1fb3d40463d5f6624bb7e8fbfb2cfafa9fd05d1e

File details

Details for the file llama_index_llms_mistralai-0.10.1-py3-none-any.whl.

File metadata

  • Download URL: llama_index_llms_mistralai-0.10.1-py3-none-any.whl
  • Upload date:
  • Size: 10.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.4 {"installer":{"name":"uv","version":"0.11.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_index_llms_mistralai-0.10.1-py3-none-any.whl
Algorithm Hash digest
SHA256 f1ccad9853a9ae10c9bc61a04b2f55ba15dea8ac1ab223a3d3f001dffd0b9595
MD5 fc789654575940be1cc609ca58bbeeec
BLAKE2b-256 6c3243472300ec60f2f0b8465a9a08aadf5d4b271f990104a2948d36ca587e92

See more details on using hashes here.
