Mistral AI provider for Metorial

metorial-mistral

Mistral AI provider integration for Metorial.

Installation

pip install metorial-mistral
# or
uv add metorial-mistral
# or
poetry add metorial-mistral

Features

  • 🤖 Mistral Integration: Full support for Mistral Large, Codestral, and other Mistral models
  • 📡 Session Management: Automatic tool lifecycle handling
  • 🔄 Format Conversion: Converts Metorial tools to Mistral function format
  • ⚡ Async Support: Full async/await support

Supported Models

All Mistral AI models that support function calling:

  • mistral-large-latest: Latest Mistral Large model with enhanced reasoning
  • mistral-large-2411: Mistral Large November 2024
  • mistral-large-2407: Mistral Large July 2024
  • mistral-small-latest: Smaller, faster Mistral model
  • codestral-latest: Specialized for code generation and analysis

Usage

Quick Start (Recommended)

import asyncio
from mistralai.async_client import MistralAsyncClient
from metorial import Metorial

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...") # async by default
  mistral_client = MistralAsyncClient(
    api_key="...your-mistral-api-key..."
  )
  
  # One-liner chat with automatic session management
  response = await metorial.run(
    "What are the latest commits in the metorial/websocket-explorer repository?",
    "...your-mcp-server-deployment-id...", # can also be a list of deployment IDs
    mistral_client,
    model="mistral-large-latest",
    max_iterations=25
  )
  
  print("Response:", response)

asyncio.run(main())

Streaming Chat

import asyncio
from mistralai.async_client import MistralAsyncClient
from metorial import Metorial
from metorial.types import StreamEventType

async def streaming_example():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  mistral_client = MistralAsyncClient(
    api_key="...your-mistral-api-key..."
  )
  
  # Streaming chat with real-time responses
  async def stream_action(session):
    messages = [
      {"role": "user", "content": "Explain quantum computing"}
    ]
    
    async for event in metorial.stream(
      mistral_client, session, messages, 
      model="mistral-large-latest",
      max_iterations=25
    ):
      if event.type == StreamEventType.CONTENT:
        print(f"🤖 {event.content}", end="", flush=True)
      elif event.type == StreamEventType.TOOL_CALL:
        print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
      elif event.type == StreamEventType.COMPLETE:
        print(f"\n✅ Complete!")
  
  await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(streaming_example())

Advanced Usage with Session Management

import asyncio
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from metorial import Metorial
from metorial_mistral import MetorialMistralSession

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  mistral = MistralClient(api_key="...your-mistral-api-key...")
  
  # Create session with your server deployments
  async with metorial.session(["...your-server-deployment-id..."]) as session:
    # Create Mistral-specific wrapper
    mistral_session = MetorialMistralSession(session.tool_manager)
    
    messages = [
      ChatMessage(role="user", content="What are the latest commits?")
    ]
    
    response = mistral.chat(
      model="mistral-large-latest",
      messages=messages,
      tools=mistral_session.tools
    )
    
    # Handle tool calls
    if response.choices[0].message.tool_calls:
      tool_responses = await mistral_session.call_tools(response.choices[0].message.tool_calls)
      
      # Add assistant message and tool responses
      messages.append(response.choices[0].message)
      messages.extend(tool_responses)
      
      # Continue conversation...

asyncio.run(main())

Using Convenience Functions

from metorial_mistral import build_mistral_tools, call_mistral_tools

async def example_with_functions(tool_manager, tool_calls):
  # tool_manager comes from a Metorial session (session.tool_manager);
  # tool_calls come from a Mistral response that contains tool calls.

  # Get tools in Mistral format
  tools = build_mistral_tools(tool_manager)
  
  # Execute the tool calls and get tool messages back
  tool_messages = await call_mistral_tools(tool_manager, tool_calls)
  return tools, tool_messages

API Reference

MetorialMistralSession

Main session class for Mistral integration.

session = MetorialMistralSession(tool_manager)

Properties:

  • tools: List of tools in Mistral format

Methods:

  • async call_tools(tool_calls): Execute tool calls and return tool messages

build_mistral_tools(tool_mgr)

Build Mistral-compatible tool definitions.

Returns: List of tool definitions in Mistral format

call_mistral_tools(tool_mgr, tool_calls)

Execute tool calls from Mistral response.

Returns: List of tool messages

Tool Format

Tools are converted to Mistral's function calling format:

{
  "type": "function",
  "function": {
    "name": "tool_name",
    "description": "Tool description",
    "parameters": {
      "type": "object",
      "properties": {...},
      "required": [...]
    },
    "strict": True
  }
}
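For illustration, here is what a converted definition might look like for a hypothetical `get_weather` tool (the tool name and schema are made up, not part of the package), along with a minimal structural check:

```python
# Hypothetical tool definition in Mistral's function-calling format,
# matching the shape shown above.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Fetch the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
        "strict": True,
    },
}

def looks_like_mistral_tool(tool: dict) -> bool:
    """Minimal structural check against the format shown above."""
    fn = tool.get("function", {})
    return (
        tool.get("type") == "function"
        and isinstance(fn.get("name"), str)
        and fn.get("parameters", {}).get("type") == "object"
    )

print(looks_like_mistral_tool(weather_tool))  # prints True
```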

Error Handling

try:
    tool_messages = await mistral_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")

Tool errors are returned as tool messages with error content.
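Because errors surface as tool messages rather than raised exceptions, you may want to scan the returned messages before continuing the conversation. A minimal sketch follows; the message shape and the "error" substring heuristic are assumptions for illustration, not part of the metorial-mistral API:

```python
def find_failed_tool_messages(tool_messages):
    """Return tool messages whose content looks like an error.

    Assumes each message is a dict with role "tool" and string content;
    the substring check is a heuristic you should adapt to what your
    tools actually return.
    """
    return [
        m for m in tool_messages
        if m.get("role") == "tool"
        and "error" in str(m.get("content", "")).lower()
    ]

# Hypothetical messages as they might come back from call_tools:
messages = [
    {"role": "tool", "tool_call_id": "1", "content": '{"commits": []}'},
    {"role": "tool", "tool_call_id": "2", "content": "Error: repository not found"},
]
print(len(find_failed_tool_messages(messages)))  # prints 1
```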

License

MIT License - see LICENSE file for details.
