
metorial-mistral

Mistral AI provider integration for Metorial.

Installation

pip install metorial-mistral
# or
uv add metorial-mistral
# or
poetry add metorial-mistral

Features

  • 🤖 Mistral Integration: Full support for Mistral Large, Codestral, and other Mistral models
  • 📡 Session Management: Automatic tool lifecycle handling
  • 🔄 Format Conversion: Converts Metorial tools to Mistral function format
  • ⚡ Async Support: Full async/await support

Supported Models

All Mistral AI models that support function calling:

  • mistral-large-latest: Latest Mistral Large model with enhanced reasoning
  • mistral-large-2411: Mistral Large November 2024
  • mistral-large-2407: Mistral Large July 2024
  • mistral-small-latest: Smaller, faster Mistral model
  • codestral-latest: Specialized for code generation and analysis
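Since different tasks favor different models, a small lookup table can make the choice explicit. This is only a sketch: the model names come from the list above, and the task categories are illustrative assumptions, not part of the metorial API.

```python
# Illustrative mapping from task type to a suitable Mistral model.
# The task categories are assumptions; the model names are from the list above.
MODEL_FOR_TASK = {
    "reasoning": "mistral-large-latest",   # strongest general reasoning
    "fast-chat": "mistral-small-latest",   # lower latency and cost
    "code": "codestral-latest",            # specialized for code generation
}

def pick_model(task: str) -> str:
    """Return a model name for the given task, defaulting to mistral-large-latest."""
    return MODEL_FOR_TASK.get(task, "mistral-large-latest")
```

Any of these model names can then be passed as the `model` argument in the examples below.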

Usage

Quick Start (Recommended)

import asyncio
from mistralai.async_client import MistralAsyncClient
from metorial import Metorial

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...") # async by default
  mistral_client = MistralAsyncClient(
    api_key="...your-mistral-api-key..."
  )
  
  # One-liner chat with automatic session management
  response = await metorial.run(
    "What are the latest commits in the metorial/websocket-explorer repository?",
    "...your-mcp-server-deployment-id...", # can also be list
    mistral_client,
    model="mistral-large-latest",
    max_iterations=25
  )
  
  print("Response:", response)

asyncio.run(main())

Streaming Chat

import asyncio
from mistralai.async_client import MistralAsyncClient
from metorial import Metorial
from metorial.types import StreamEventType

async def streaming_example():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  mistral_client = MistralAsyncClient(
    api_key="...your-mistral-api-key..."
  )
  
  # Streaming chat with real-time responses
  async def stream_action(session):
    messages = [
      {"role": "user", "content": "Explain quantum computing"}
    ]
    
    async for event in metorial.stream(
      mistral_client, session, messages, 
      model="mistral-large-latest",
      max_iterations=25
    ):
      if event.type == StreamEventType.CONTENT:
        print(f"🤖 {event.content}", end="", flush=True)
      elif event.type == StreamEventType.TOOL_CALL:
        print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
      elif event.type == StreamEventType.COMPLETE:
        print(f"\n✅ Complete!")
  
  await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(streaming_example())

Advanced Usage with Session Management

import asyncio
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from metorial import Metorial
from metorial_mistral import MetorialMistralSession

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  mistral = MistralClient(api_key="...your-mistral-api-key...")
  
  # Create session with your server deployments
  async with metorial.session(["...your-server-deployment-id..."]) as session:
    # Create Mistral-specific wrapper
    mistral_session = MetorialMistralSession(session.tool_manager)
    
    messages = [
      ChatMessage(role="user", content="What are the latest commits?")
    ]
    
    response = mistral.chat(
      model="mistral-large-latest",
      messages=messages,
      tools=mistral_session.tools
    )
    
    # Handle tool calls
    if response.choices[0].message.tool_calls:
      tool_responses = await mistral_session.call_tools(response.choices[0].message.tool_calls)
      
      # Add assistant message and tool responses
      messages.append(response.choices[0].message)
      messages.extend(tool_responses)
      
      # Continue conversation...

asyncio.run(main())

Using Convenience Functions

from metorial_mistral import build_mistral_tools, call_mistral_tools

async def example_with_functions(tool_manager, tool_calls):
  # tool_manager: from an active Metorial session (session.tool_manager)
  # tool_calls: from a Mistral chat response (response.choices[0].message.tool_calls)

  # Get tools in Mistral format
  tools = build_mistral_tools(tool_manager)

  # Call tools from Mistral response
  tool_messages = await call_mistral_tools(tool_manager, tool_calls)

API Reference

MetorialMistralSession

Main session class for Mistral integration.

session = MetorialMistralSession(tool_manager)

Properties:

  • tools: List of tools in Mistral format

Methods:

  • async call_tools(tool_calls): Execute tool calls and return tool messages

build_mistral_tools(tool_mgr)

Build Mistral-compatible tool definitions.

Returns: List of tool definitions in Mistral format

call_mistral_tools(tool_mgr, tool_calls)

Execute tool calls from Mistral response.

Returns: List of tool messages

Tool Format

Tools are converted to Mistral's function calling format:

{
  "type": "function",
  "function": {
    "name": "tool_name",
    "description": "Tool description",
    "parameters": {
      "type": "object",
      "properties": {...},
      "required": [...]
    },
    "strict": True
  }
}
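To make the conversion concrete, here is a sketch of building one such tool definition by hand from a name, description, and JSON-Schema parameter spec. The helper name `to_mistral_tool` and the example tool are hypothetical; `build_mistral_tools` performs this conversion for you.

```python
def to_mistral_tool(name: str, description: str, parameters: dict) -> dict:
    """Hypothetical helper: wrap a JSON-Schema parameter spec in the
    Mistral function-calling tool format shown above."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,
            "strict": True,
        },
    }

# Example (hypothetical tool): list recent commits in a repository.
tool = to_mistral_tool(
    "get_commits",
    "List recent commits in a repository",
    {
        "type": "object",
        "properties": {"repo": {"type": "string"}},
        "required": ["repo"],
    },
)
```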

Error Handling

try:
    tool_messages = await mistral_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")

Tool errors are returned as tool messages with error content.
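One pattern this enables is scanning the returned tool messages for error content before continuing the conversation. The sketch below assumes each tool message is a dict with a string `content` field; the exact message shape may differ between metorial versions, so treat it as illustrative.

```python
def failed_tool_messages(tool_messages: list) -> list:
    """Return tool messages whose content looks like an error.

    Assumes each message is a dict with a string "content" field; the exact
    message shape depends on your metorial version, so this is a sketch.
    """
    return [
        msg for msg in tool_messages
        if "error" in str(msg.get("content", "")).lower()
    ]

# Hypothetical tool messages, as might be returned by call_tools():
messages = [
    {"role": "tool", "tool_call_id": "1", "content": '{"commits": []}'},
    {"role": "tool", "tool_call_id": "2", "content": "Error: repository not found"},
]
failures = failed_tool_messages(messages)
```

You might log such failures or feed them back to the model so it can retry with different arguments.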

License

MIT License - see LICENSE file for details.
