
metorial-mistral

Mistral AI provider integration for Metorial.

Installation

pip install metorial-mistral
# or
uv add metorial-mistral
# or
poetry add metorial-mistral

Features

  • 🤖 Mistral Integration: Full support for Mistral Large, Codestral, and other Mistral models
  • 📡 Session Management: Automatic tool lifecycle handling
  • 🔄 Format Conversion: Converts Metorial tools to Mistral function format
  • ⚡ Async Support: Full async/await support

Supported Models

All Mistral AI models that support function calling are supported, including:

  • mistral-large-latest: Latest Mistral Large model with enhanced reasoning
  • mistral-large-2411: Mistral Large November 2024
  • mistral-large-2407: Mistral Large July 2024
  • mistral-small-latest: Smaller, faster Mistral model
  • codestral-latest: Specialized for code generation and analysis
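
As a rough guide, you might select a model per task. The mapping below is a hypothetical helper, not part of the library; the model names come from the list above and may change as Mistral updates its lineup.

```python
# Hypothetical helper: map a task type to a Mistral model name.
# Model identifiers follow the list above and are assumptions that may go stale.
MODEL_BY_TASK = {
    "reasoning": "mistral-large-latest",  # strongest general-purpose model
    "fast": "mistral-small-latest",       # lower latency and cost
    "code": "codestral-latest",           # code generation and analysis
}

def pick_model(task: str) -> str:
    """Return a model name for the given task, defaulting to the large model."""
    return MODEL_BY_TASK.get(task, "mistral-large-latest")

print(pick_model("code"))     # codestral-latest
print(pick_model("unknown"))  # mistral-large-latest
```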

Usage

Quick Start (Recommended)

import asyncio
from mistralai.async_client import MistralAsyncClient
from metorial import Metorial

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...") # async by default
  mistral_client = MistralAsyncClient(
    api_key="...your-mistral-api-key..."
  )
  
  # One-liner chat with automatic session management
  response = await metorial.run(
    "What are the latest commits in the metorial/websocket-explorer repository?",
    "...your-mcp-server-deployment-id...", # can also be a list of deployment IDs
    mistral_client,
    model="mistral-large-latest",
    max_iterations=25
  )
  
  print("Response:", response)

asyncio.run(main())

Streaming Chat

import asyncio
from mistralai.async_client import MistralAsyncClient
from metorial import Metorial
from metorial.types import StreamEventType

async def streaming_example():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  mistral_client = MistralAsyncClient(
    api_key="...your-mistral-api-key..."
  )
  
  # Streaming chat with real-time responses
  async def stream_action(session):
    messages = [
      {"role": "user", "content": "Explain quantum computing"}
    ]
    
    async for event in metorial.stream(
      mistral_client, session, messages, 
      model="mistral-large-latest",
      max_iterations=25
    ):
      if event.type == StreamEventType.CONTENT:
        print(f"🤖 {event.content}", end="", flush=True)
      elif event.type == StreamEventType.TOOL_CALL:
        print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
      elif event.type == StreamEventType.COMPLETE:
        print("\n✅ Complete!")
  
  await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(streaming_example())
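
If you also need the full response text after streaming finishes, you can accumulate the content events as they arrive. The sketch below uses a tiny stand-in class for the events, since only the `type` and `content` fields shown above are known here; real events come from `metorial.stream()`.

```python
from dataclasses import dataclass

# Stand-in for metorial's stream events, used only for this illustration.
@dataclass
class FakeEvent:
    type: str          # e.g. "content", "tool_call", "complete"
    content: str = ""

def accumulate_content(events) -> str:
    """Concatenate the text of all content events into one string."""
    parts = []
    for event in events:
        if event.type == "content":
            parts.append(event.content)
    return "".join(parts)

events = [FakeEvent("content", "Quantum "), FakeEvent("tool_call"),
          FakeEvent("content", "computing..."), FakeEvent("complete")]
print(accumulate_content(events))  # Quantum computing...
```

In the real streaming loop you would append `event.content` inside the `StreamEventType.CONTENT` branch instead of (or in addition to) printing it.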

Advanced Usage with Session Management

import asyncio
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from metorial import Metorial
from metorial_mistral import MetorialMistralSession

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  mistral = MistralClient(api_key="...your-mistral-api-key...")
  
  # Create session with your server deployments
  async with metorial.session(["...your-server-deployment-id..."]) as session:
    # Create Mistral-specific wrapper
    mistral_session = MetorialMistralSession(session.tool_manager)
    
    messages = [
      ChatMessage(role="user", content="What are the latest commits?")
    ]
    
    response = mistral.chat(
      model="mistral-large-latest",
      messages=messages,
      tools=mistral_session.tools
    )
    
    # Handle tool calls
    if response.choices[0].message.tool_calls:
      tool_responses = await mistral_session.call_tools(response.choices[0].message.tool_calls)
      
      # Add assistant message and tool responses
      messages.append(response.choices[0].message)
      messages.extend(tool_responses)
      
      # Continue conversation...

asyncio.run(main())
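
The "continue conversation" step above relies on the ordering Mistral expects for a tool-calling round: the assistant message carrying the tool calls first, then one tool message per call. A minimal, framework-free sketch of that ordering, with plain dicts standing in for the SDK's message objects:

```python
def append_tool_round(messages, assistant_message, tool_responses):
    """Append a tool-calling round in the order Mistral expects:
    the assistant message with tool_calls, then its tool results."""
    messages.append(assistant_message)
    messages.extend(tool_responses)
    return messages

# Illustrative message shapes; the real objects come from the Mistral SDK
# and from mistral_session.call_tools().
history = [{"role": "user", "content": "What are the latest commits?"}]
assistant = {"role": "assistant",
             "tool_calls": [{"id": "call_1", "function": {"name": "list_commits"}}]}
results = [{"role": "tool", "tool_call_id": "call_1", "content": '{"commits": []}'}]

history = append_tool_round(history, assistant, results)
print([m["role"] for m in history])  # ['user', 'assistant', 'tool']
```

After this, a second `mistral.chat(...)` call with the updated `messages` lets the model read the tool results and produce its final answer.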

Using Convenience Functions

from metorial_mistral import build_mistral_tools, call_mistral_tools

async def example_with_functions():
  # tool_manager comes from an active Metorial session (session.tool_manager)
  # Get tools in Mistral format
  tools = build_mistral_tools(tool_manager)
  
  # tool_calls comes from a Mistral chat response (message.tool_calls)
  tool_messages = await call_mistral_tools(tool_manager, tool_calls)

API Reference

MetorialMistralSession

Main session class for Mistral integration.

session = MetorialMistralSession(tool_manager)

Properties:

  • tools: List of tools in Mistral format

Methods:

  • async call_tools(tool_calls): Execute tool calls and return tool messages

build_mistral_tools(tool_mgr)

Build Mistral-compatible tool definitions.

Returns: List of tool definitions in Mistral format

call_mistral_tools(tool_mgr, tool_calls)

Execute tool calls from Mistral response.

Returns: List of tool messages

Tool Format

Tools are converted to Mistral's function calling format:

{
  "type": "function",
  "function": {
    "name": "tool_name",
    "description": "Tool description",
    "parameters": {
      "type": "object",
      "properties": {...},
      "required": [...]
    },
    "strict": True
  }
}
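
For instance, a hypothetical `list_commits` tool would come out roughly as follows. The field names follow the template above; the tool itself and its parameters are made up for illustration.

```python
# Hypothetical converted tool definition, filled in from the template above.
tool = {
    "type": "function",
    "function": {
        "name": "list_commits",
        "description": "List recent commits in a repository",
        "parameters": {
            "type": "object",
            "properties": {
                "repo": {"type": "string", "description": "owner/name"},
                "limit": {"type": "integer", "description": "max commits to return"},
            },
            "required": ["repo"],
        },
        "strict": True,
    },
}

# The model only sees this schema; "required" gates what it must supply.
print(tool["function"]["parameters"]["required"])  # ['repo']
```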

Error Handling

try:
    tool_messages = await mistral_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")

Tool errors are returned as tool messages with error content.
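
Because errors normally come back as tool messages rather than raised exceptions, you may want to scan the results for failures before continuing the conversation. The error-content shape below is an assumption (real messages may encode errors differently); the filtering pattern is the point.

```python
def failed_tool_messages(tool_messages):
    """Return tool messages whose content looks like an error.
    Assumes errors are reported as "Error: ..." strings; adapt to the real shape."""
    return [m for m in tool_messages
            if isinstance(m.get("content"), str) and m["content"].startswith("Error:")]

# Illustrative tool messages; real ones come from mistral_session.call_tools().
messages = [
    {"role": "tool", "tool_call_id": "call_1", "content": '{"ok": true}'},
    {"role": "tool", "tool_call_id": "call_2", "content": "Error: deployment unreachable"},
]
print(len(failed_tool_messages(messages)))  # 1
```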

License

MIT License - see LICENSE file for details.
