# metorial-mistral

Mistral AI provider integration for Metorial.
## Installation

```bash
pip install metorial-mistral
# or
uv add metorial-mistral
# or
poetry add metorial-mistral
```
## Features

- 🤖 **Mistral Integration**: Full support for Mistral Large, Codestral, and other Mistral models
- 📡 **Session Management**: Automatic tool lifecycle handling
- 🔄 **Format Conversion**: Converts Metorial tools to Mistral function format
- ⚡ **Async Support**: Full async/await support
## Supported Models

All Mistral AI models that support function calling, including:

- `mistral-large-latest`: Latest Mistral Large model with enhanced reasoning
- `mistral-large-2411`: Mistral Large, November 2024
- `mistral-large-2407`: Mistral Large, July 2024
- `mistral-small-latest`: Smaller, faster Mistral model
- `codestral-latest`: Specialized for code generation and analysis
## Usage

### Quick Start (Recommended)

```python
import asyncio

from mistralai.async_client import MistralAsyncClient

from metorial import Metorial


async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")  # async by default
    mistral_client = MistralAsyncClient(api_key="...your-mistral-api-key...")

    # One-liner chat with automatic session management
    response = await metorial.run(
        "What are the latest commits in the metorial/websocket-explorer repository?",
        "...your-mcp-server-deployment-id...",  # can also be a list
        mistral_client,
        model="mistral-large-latest",
        max_iterations=25,
    )
    print("Response:", response)


asyncio.run(main())
```
### Streaming Chat

```python
import asyncio

from mistralai.async_client import MistralAsyncClient

from metorial import Metorial
from metorial.types import StreamEventType


async def streaming_example():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")
    mistral_client = MistralAsyncClient(api_key="...your-mistral-api-key...")

    # Streaming chat with real-time responses
    async def stream_action(session):
        messages = [{"role": "user", "content": "Explain quantum computing"}]

        async for event in metorial.stream(
            mistral_client,
            session,
            messages,
            model="mistral-large-latest",
            max_iterations=25,
        ):
            if event.type == StreamEventType.CONTENT:
                print(f"🤖 {event.content}", end="", flush=True)
            elif event.type == StreamEventType.TOOL_CALL:
                print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
            elif event.type == StreamEventType.COMPLETE:
                print("\n✅ Complete!")

    await metorial.with_session("...your-server-deployment-id...", stream_action)


asyncio.run(streaming_example())
```
### Advanced Usage with Session Management

```python
import asyncio

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

from metorial import Metorial
from metorial_mistral import MetorialMistralSession


async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")
    mistral = MistralClient(api_key="...your-mistral-api-key...")

    # Create a session with your server deployments
    async with metorial.session(["...your-server-deployment-id..."]) as session:
        # Create the Mistral-specific wrapper
        mistral_session = MetorialMistralSession(session.tool_manager)

        messages = [ChatMessage(role="user", content="What are the latest commits?")]

        response = mistral.chat(
            model="mistral-large-latest",
            messages=messages,
            tools=mistral_session.tools,
        )

        # Handle tool calls
        if response.choices[0].message.tool_calls:
            tool_responses = await mistral_session.call_tools(
                response.choices[0].message.tool_calls
            )

            # Add the assistant message and tool responses
            messages.append(response.choices[0].message)
            messages.extend(tool_responses)

            # Continue the conversation...


asyncio.run(main())
```
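The "continue the conversation" step above is just another `chat` call with the extended message list, repeated until the model stops requesting tools. One way that loop could be sketched is shown below; `chat` and `call_tools` are stand-ins you would bind to `mistral.chat(...)` and `mistral_session.call_tools(...)`, and the helper itself is illustrative, not part of `metorial_mistral`:

```python
async def run_tool_loop(chat, call_tools, messages, max_iterations=5):
    """Repeat chat -> tool execution until the model stops requesting tools.

    `chat(messages)` should return a Mistral-style chat response;
    `call_tools(tool_calls)` should return a list of tool messages.
    Both are hypothetical stand-ins for the calls in the example above.
    """
    for _ in range(max_iterations):
        response = chat(messages)
        message = response.choices[0].message
        messages.append(message)

        tool_calls = getattr(message, "tool_calls", None)
        if not tool_calls:
            # No further tool requests: this is the final assistant answer.
            return message

        # Feed the tool results back so the model can use them next round.
        messages.extend(await call_tools(tool_calls))

    raise RuntimeError("tool loop did not finish within max_iterations")
```

The iteration cap mirrors the `max_iterations` parameter that `metorial.run` exposes, guarding against a model that keeps requesting tools indefinitely.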
### Using Convenience Functions

```python
from metorial_mistral import build_mistral_tools, call_mistral_tools


async def example_with_functions():
    # Get tools in Mistral format
    tools = build_mistral_tools(tool_manager)

    # Call tools from a Mistral response
    tool_messages = await call_mistral_tools(tool_manager, tool_calls)
```
## API Reference

### MetorialMistralSession

The main session class for the Mistral integration.

```python
session = MetorialMistralSession(tool_manager)
```

**Properties:**

- `tools`: List of tools in Mistral format

**Methods:**

- `async call_tools(tool_calls)`: Execute tool calls and return tool messages

### build_mistral_tools(tool_mgr)

Build Mistral-compatible tool definitions.

**Returns:** List of tool definitions in Mistral format

### call_mistral_tools(tool_mgr, tool_calls)

Execute tool calls from a Mistral response.

**Returns:** List of tool messages
## Tool Format

Tools are converted to Mistral's function calling format:

```python
{
    "type": "function",
    "function": {
        "name": "tool_name",
        "description": "Tool description",
        "parameters": {
            "type": "object",
            "properties": {...},
            "required": [...]
        },
        "strict": True
    }
}
```
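For illustration, a conversion along these lines can be sketched as a plain function. The `name`/`description`/`parameters` inputs here are assumptions about what a Metorial tool exposes; the real conversion lives inside `build_mistral_tools`:

```python
def to_mistral_tool(name, description, parameters):
    """Wrap a JSON-schema parameter spec in Mistral's function-calling format.

    Illustrative sketch only: the argument names are assumed, and the actual
    field mapping is handled by `build_mistral_tools` in metorial_mistral.
    """
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,  # JSON Schema for the tool's arguments
            "strict": True,            # ask the model to match the schema exactly
        },
    }
```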
## Error Handling

```python
try:
    tool_messages = await mistral_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")
```

Tool errors are returned as tool messages with error content.
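In other words, a failed tool call typically does not raise at all; the failure travels back to the model inside a tool message. A hypothetical error message might be shaped like this (the field names are an assumption for illustration, not taken from `metorial_mistral`):

```python
def error_tool_message(tool_call_id, name, error):
    """Build a tool-role message carrying an error, in the same shape as a
    successful tool result, so the model can see that the call failed.

    Illustrative only: the exact fields metorial_mistral emits may differ.
    """
    return {
        "role": "tool",
        "tool_call_id": tool_call_id,  # ties the result back to the request
        "name": name,
        "content": f"[ERROR] Tool call failed: {error}",
    }
```

Returning errors this way lets the model react (retry, apologize, or try another tool) instead of the whole loop aborting on the first failure.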
## License

MIT License - see LICENSE file for details.