
LlamaIndex Llms Integration: Mistral

Installation

Install the required packages using the following commands (the % and ! prefixes are for notebook environments; omit them in a regular shell):

%pip install llama-index-llms-mistralai
!pip install llama-index

Basic Usage

Initialize the MistralAI Model

To use the MistralAI model, create an instance and provide your API key:

from llama_index.llms.mistralai import MistralAI

llm = MistralAI(api_key="<replace-with-your-key>")

Generate Completions

To generate a text completion for a prompt, use the complete method:

resp = llm.complete("Paul Graham is ")
print(resp)

Chat with the Model

You can also chat with the model using a list of messages. Here’s an example:

from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.chat(messages)
print(resp)

Using Random Seed

To set a random seed for reproducibility, initialize the model with the random_seed parameter:

resp = MistralAI(api_key="<replace-with-your-key>", random_seed=42).chat(messages)
print(resp)

Streaming Responses

Stream Completions

You can stream responses using the stream_complete method:

resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")

Stream Chat Responses

To stream chat messages, use the following code:

messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")

Configure Model

To use a specific model, pass its name when initializing the client:

llm = MistralAI(model="mistral-medium")
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")

Mistral Azure SDK Usage

To use the Mistral Azure SDK implementation, pass the Azure endpoint and API key. When these are provided, the client automatically uses the Mistral Azure SDK instead of the public Mistral endpoint.

from llama_index.llms.mistralai import MistralAI

llm = MistralAI(
    azure_endpoint="https://<your-azure-mistral-endpoint>",
    azure_api_key="<replace-with-your-azure-key>",
    model="mistral-large-latest",
)

resp = llm.complete("Paul Graham is ")
print(resp)

Function Calling

The model can call functions that you expose as tools. Here’s an example:

from llama_index.llms.mistralai import MistralAI
from llama_index.core.tools import FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


def mystery(a: int, b: int) -> int:
    """Mystery function on two integers."""
    return a * b + a + b


mystery_tool = FunctionTool.from_defaults(fn=mystery)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = MistralAI(model="mistral-large-latest")
response = llm.predict_and_call(
    [mystery_tool, multiply_tool],
    user_msg="What happens if I run the mystery function on 5 and 7",
)
print(str(response))
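As a sanity check independent of the model, the mystery tool’s expected result for these inputs can be computed directly:

```python
def mystery(a: int, b: int) -> int:
    """Mystery function on two integers."""
    return a * b + a + b


# 5 * 7 + 5 + 7 = 47, so a correct tool call should report 47
print(mystery(5, 7))
```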

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/mistralai/
