metorial-xai

XAI (Grok) provider integration for Metorial.

Installation

pip install metorial-xai
# or
uv add metorial-xai
# or
poetry add metorial-xai

Features

  • 🤖 Grok Integration: Full support for Grok models
  • 📡 Session Management: Automatic tool lifecycle handling
  • 🔒 Strict Mode: Built-in strict parameter validation
  • ⚡ Async Support: Full async/await support

Supported Models

All XAI Grok models that support function calling:

  • grok-beta: Latest Grok model with enhanced reasoning
  • grok-2-1212: Grok 2.0 December 2024 release
  • grok-2-vision-1212: Grok 2.0 with vision capabilities

Usage

Quick Start (Recommended)

import asyncio
from openai import AsyncOpenAI
from metorial import Metorial

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...") # async by default
  xai_client = AsyncOpenAI(
    api_key="...your-xai-api-key...", 
    base_url="https://api.x.ai/v1"
  )
  
  # Run with automatic session management
  response = await metorial.run(
    "What are the latest commits in the metorial/websocket-explorer repository?",
    "...your-mcp-server-deployment-id...", # can also be list
    xai_client,
    model="grok-beta",
    max_iterations=25
  )
  
  print("Response:", response)

asyncio.run(main())

Streaming Chat

import asyncio
from openai import AsyncOpenAI
from metorial import Metorial
from metorial.types import StreamEventType

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  xai_client = AsyncOpenAI(
    api_key="...your-xai-api-key...",
    base_url="https://api.x.ai/v1"
  )
  
  # Streaming chat with real-time responses
  async def stream_action(session):
    messages = [
      {"role": "user", "content": "Explain quantum computing"}
    ]
    
    async for event in metorial.stream(
      xai_client, session, messages, 
      model="grok-beta",
      max_iterations=25
    ):
      if event.type == StreamEventType.CONTENT:
        print(f"🤖 {event.content}", end="", flush=True)
      elif event.type == StreamEventType.TOOL_CALL:
        print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
      elif event.type == StreamEventType.COMPLETE:
        print(f"\n✅ Complete!")
  
  await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(main())

Advanced Usage with Session Management

import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial_xai import MetorialXAISession

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  
  # XAI uses an OpenAI-compatible client
  xai_client = OpenAI(
    api_key="...your-xai-api-key...",
    base_url="https://api.x.ai/v1"
  )
  
  # Create session with your server deployments
  async with metorial.session(["...your-server-deployment-id..."]) as session:
    # Create XAI-specific wrapper
    xai_session = MetorialXAISession(session.tool_manager)
    
    messages = [
      {"role": "user", "content": "What are the latest commits?"}
    ]
    
    response = xai_client.chat.completions.create(
      model="grok-beta",
      messages=messages,
      tools=xai_session.tools
    )
    
    # Handle tool calls
    tool_calls = response.choices[0].message.tool_calls
    if tool_calls:
      tool_responses = await xai_session.call_tools(tool_calls)
      
      # Add to conversation
      messages.append({
        "role": "assistant",
        "tool_calls": tool_calls
      })
      messages.extend(tool_responses)
      
      # Continue conversation...

asyncio.run(main())

Using Convenience Functions

from metorial_xai import build_xai_tools, call_xai_tools

async def example(tool_manager, tool_calls):
  # tool_manager comes from an active Metorial session (session.tool_manager);
  # tool_calls come from an XAI chat completion response

  # Get tools in XAI format
  tools = build_xai_tools(tool_manager)

  # Call tools from XAI response
  tool_messages = await call_xai_tools(tool_manager, tool_calls)
  return tool_messages

API Reference

MetorialXAISession

Main session class for XAI integration.

session = MetorialXAISession(tool_manager)

Properties:

  • tools: List of tools in OpenAI-compatible format with strict mode

Methods:

  • async call_tools(tool_calls): Execute tool calls and return tool messages

build_xai_tools(tool_mgr)

Build XAI-compatible tool definitions.

Returns: List of tool definitions in OpenAI format with strict mode

call_xai_tools(tool_mgr, tool_calls)

Execute tool calls from XAI response.

Returns: List of tool messages
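Since XAI uses the OpenAI chat format, each returned tool message is a dict along these lines (the id and payload below are invented for illustration, not values the library guarantees):

```python
# Illustrative shape of one tool message returned by call_xai_tools
# (field values here are hypothetical)
tool_message = {
  "role": "tool",
  "tool_call_id": "call_abc123",          # matches the model's tool call id
  "content": '{"commits": ["a1b2c3"]}',   # tool result serialized as a string
}
```

These messages can be appended directly to the conversation before the next `chat.completions.create` call.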

Tool Format

Tools are converted to OpenAI-compatible format with strict mode enabled:

{
  "type": "function",
  "function": {
    "name": "tool_name",
    "description": "Tool description",
    "parameters": {
      "type": "object",
      "properties": {...},
      "required": [...]
    },
    "strict": True
  }
}
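As a concrete sketch, the helper below assembles a definition in exactly this shape for a hypothetical `get_commits` tool (the helper, tool name, and parameters are invented for illustration; `build_xai_tools` produces this format for you):

```python
# Build a tool definition in the OpenAI-compatible strict format
def make_tool_definition(name, description, properties, required):
  return {
    "type": "function",
    "function": {
      "name": name,
      "description": description,
      "parameters": {
        "type": "object",
        "properties": properties,
        "required": required,
      },
      "strict": True,
    },
  }

tool = make_tool_definition(
  "get_commits",
  "List recent commits in a repository",
  {"repository": {"type": "string"}},
  ["repository"],
)
```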

XAI API Configuration

XAI uses the OpenAI-compatible API format. Configure your client like this:

from openai import OpenAI

client = OpenAI(
  api_key="...your-xai-api-key...",
  base_url="https://api.x.ai/v1"
)

Error Handling

try:
  response = await metorial.run(
    "Your query", "...deployment-id...", xai_client, 
    model="grok-beta", max_iterations=25
  )
except Exception as e:
  print(f"Request failed: {e}")

Tool errors are automatically handled and returned as error messages.
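For transient failures (rate limits, network blips) a small retry wrapper can help. This is a generic sketch, not part of the metorial API; `coro_factory` is any zero-argument callable that returns a fresh coroutine, e.g. `lambda: metorial.run(...)`:

```python
import asyncio

async def run_with_retries(coro_factory, attempts=3, delay=1.0):
  # Retry an async call a few times before giving up
  for attempt in range(1, attempts + 1):
    try:
      return await coro_factory()
    except Exception:
      if attempt == attempts:
        raise  # exhausted retries, surface the last error
      await asyncio.sleep(delay * attempt)  # simple linear backoff
```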

License

MIT License - see LICENSE file for details.
