OpenInference Mistral AI Instrumentation
Python autoinstrumentation library for MistralAI's Python SDK.
The traces emitted by this instrumentation are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector, such as arize-phoenix, for viewing.
Installation
pip install openinference-instrumentation-mistralai
Quickstart
In this example, we will instrument a small program that uses the MistralAI chat completions API and observe the traces via arize-phoenix.
Install packages.
pip install openinference-instrumentation-mistralai mistralai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
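If you want to confirm that the packages are importable before proceeding, a quick check with the standard library works (a minimal sketch; the module names correspond to the packages in the pip command above):

```python
from importlib.util import find_spec

# find_spec returns None for any module that is not importable
# in the current environment, without actually importing it.
for module in ("mistralai", "phoenix", "opentelemetry"):
    status = "found" if find_spec(module) is not None else "missing"
    print(f"{module}: {status}")
```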
Start the phoenix server so that it is ready to collect traces. The Phoenix server runs entirely on your machine and does not send data over the internet.
python -m phoenix.server.main serve
In a Python file, set up the MistralAIInstrumentor and configure the tracer to send traces to Phoenix.
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from openinference.instrumentation.mistralai import MistralAIInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
# Optionally, you can also print the spans to the console.
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace_api.set_tracer_provider(tracer_provider)
MistralAIInstrumentor().instrument()

if __name__ == "__main__":
    client = MistralClient()
    response = client.chat(
        model="mistral-large-latest",
        messages=[
            ChatMessage(
                content="Who won the World Cup in 2018?",
                role="user",
            )
        ],
    )
    print(response.choices[0].message.content)
Since we are using MistralAI, we must set the MISTRAL_API_KEY
environment variable to authenticate with the MistralAI API.
export MISTRAL_API_KEY=[your_key_here]
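Alternatively, for local experiments you can set the key from inside the script with the standard library before creating the client (a sketch; hardcoding real credentials is not recommended, and "your_key_here" is a placeholder):

```python
import os

# setdefault only assigns the value if the variable is not already
# set in the shell environment, so an exported key takes precedence.
os.environ.setdefault("MISTRAL_API_KEY", "your_key_here")
```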
Now run the Python file and observe the traces in Phoenix.
python your_file.py
More Info