
Project description

LlamaIndex Llms Integration: Mistral

Installation

Install the required packages using the following commands:

%pip install llama-index-llms-mistralai
%pip install llama-index

Basic Usage

Initialize the MistralAI Model

To use the MistralAI model, create an instance and provide your API key:

from llama_index.llms.mistralai import MistralAI

llm = MistralAI(api_key="<replace-with-your-key>")

Generate Completions

To generate a text completion for a prompt, use the complete method:

resp = llm.complete("Paul Graham is ")
print(resp)

Chat with the Model

You can also chat with the model using a list of messages. Here’s an example:

from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.chat(messages)
print(resp)

Using Random Seed

To set a random seed for reproducibility, initialize the model with the random_seed parameter:

resp = MistralAI(api_key="<replace-with-your-key>", random_seed=42).chat(messages)
print(resp)

Streaming Responses

Stream Completions

You can stream responses using the stream_complete method:

resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")

Stream Chat Responses

To stream chat messages, use the following code:

messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")

Configure Model

To use a specific model configuration, initialize the model with the desired model name:

llm = MistralAI(model="mistral-medium")
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")

Mistral Azure SDK Usage

To use the Mistral Azure SDK implementation, pass the Azure endpoint and API key. When these are provided, the client automatically uses the Mistral Azure SDK instead of the public Mistral endpoint.

from llama_index.llms.mistralai import MistralAI

llm = MistralAI(
    azure_endpoint="<your-azure-endpoint-url>",
    azure_api_key="<replace-with-your-azure-key>",
    model="mistral-large-latest",
)

resp = llm.complete("Paul Graham is ")
print(resp)

Function Calling

You can call functions from the model by defining tools. Here’s an example:

from llama_index.llms.mistralai import MistralAI
from llama_index.core.tools import FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


def mystery(a: int, b: int) -> int:
    """Mystery function on two integers."""
    return a * b + a + b


mystery_tool = FunctionTool.from_defaults(fn=mystery)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = MistralAI(model="mistral-large-latest")
response = llm.predict_and_call(
    [mystery_tool, multiply_tool],
    user_msg="What happens if I run the mystery function on 5 and 7?",
)
print(str(response))

LLM Implementation example

https://docs.llamaindex.ai/en/stable/examples/llm/mistralai/

Project details


Download files

Download the file for your platform.

Source Distribution

llama_index_llms_mistralai-0.10.2.tar.gz (10.0 kB)

Uploaded Source

Built Distribution


llama_index_llms_mistralai-0.10.2-py3-none-any.whl (10.1 kB)

Uploaded Python 3

File details

Details for the file llama_index_llms_mistralai-0.10.2.tar.gz.

File metadata

  • Download URL: llama_index_llms_mistralai-0.10.2.tar.gz
  • Upload date:
  • Size: 10.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.8 {"installer":{"name":"uv","version":"0.11.8","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_index_llms_mistralai-0.10.2.tar.gz:

  • SHA256: 6bcf466aab18d6aa04d7def38d670f8517d4aa0ff393aff27e65fe578007945a
  • MD5: 816b0d402cd922bebfa0b6aa4e72c2d3
  • BLAKE2b-256: df805aa8a1f4b20374915d6f4ee20a109de4947496bd33339470de25f0979a3b


File details

Details for the file llama_index_llms_mistralai-0.10.2-py3-none-any.whl.

File metadata

  • Download URL: llama_index_llms_mistralai-0.10.2-py3-none-any.whl
  • Upload date:
  • Size: 10.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.8 {"installer":{"name":"uv","version":"0.11.8","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_index_llms_mistralai-0.10.2-py3-none-any.whl:

  • SHA256: 122922a7e9d6f3c4cf4bdfcdfd816f2ab2b9a2af1581d3658ea00dcde716ec49
  • MD5: 09216e5e1207cb3d14eebd5c925f4571
  • BLAKE2b-256: e5248c9b4813ae6c70a8c0dc2dea4fed8b2c3c817f8aec5deb2d5bd583f366b1

