# metorial-xai

XAI (Grok) provider integration for Metorial.
## Installation

```bash
pip install metorial-xai
# or
uv add metorial-xai
# or
poetry add metorial-xai
```
## Features
- 🤖 Grok Integration: Full support for Grok models
- 📡 Session Management: Automatic tool lifecycle handling
- ✅ Strict Mode: Built-in strict parameter validation
- ⚡ Async Support: Full async/await support
## Supported Models

All XAI Grok models that support function calling:

- `grok-beta`: Latest Grok model with enhanced reasoning
- `grok-2-1212`: Grok 2.0 December 2024 release
- `grok-2-vision-1212`: Grok 2.0 with vision capabilities
## Usage

### Quick Start (Recommended)

```python
import asyncio
from openai import AsyncOpenAI
from metorial import Metorial

async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")  # async by default
    xai_client = AsyncOpenAI(
        api_key="...your-xai-api-key...",
        base_url="https://api.x.ai/v1"
    )

    # Run with automatic session management
    response = await metorial.run(
        "What are the latest commits in the metorial/websocket-explorer repository?",
        "...your-mcp-server-deployment-id...",  # can also be a list
        xai_client,
        model="grok-beta",
        max_iterations=25
    )

    print("Response:", response)

asyncio.run(main())
```
### Streaming Chat

```python
import asyncio
from openai import AsyncOpenAI
from metorial import Metorial
from metorial.types import StreamEventType

async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")
    xai_client = AsyncOpenAI(
        api_key="...your-xai-api-key...",
        base_url="https://api.x.ai/v1"
    )

    # Streaming chat with real-time responses
    async def stream_action(session):
        messages = [
            {"role": "user", "content": "Explain quantum computing"}
        ]

        async for event in metorial.stream(
            xai_client, session, messages,
            model="grok-beta",
            max_iterations=25
        ):
            if event.type == StreamEventType.CONTENT:
                print(f"🤖 {event.content}", end="", flush=True)
            elif event.type == StreamEventType.TOOL_CALL:
                print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
            elif event.type == StreamEventType.COMPLETE:
                print("\n✅ Complete!")

    await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(main())
```
### Advanced Usage with Session Management

```python
import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial_xai import MetorialXAISession

async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")

    # XAI uses an OpenAI-compatible client
    xai_client = OpenAI(
        api_key="...your-xai-api-key...",
        base_url="https://api.x.ai/v1"
    )

    # Create a session with your server deployments
    async with metorial.session(["...your-server-deployment-id..."]) as session:
        # Create the XAI-specific wrapper
        xai_session = MetorialXAISession(session.tool_manager)

        messages = [
            {"role": "user", "content": "What are the latest commits?"}
        ]

        response = xai_client.chat.completions.create(
            model="grok-beta",
            messages=messages,
            tools=xai_session.tools
        )

        # Handle tool calls
        tool_calls = response.choices[0].message.tool_calls
        if tool_calls:
            tool_responses = await xai_session.call_tools(tool_calls)

            # Add to the conversation
            messages.append({
                "role": "assistant",
                "tool_calls": tool_calls
            })
            messages.extend(tool_responses)

            # Continue conversation...

asyncio.run(main())
```
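The `# Continue conversation...` step above is the usual tool-call loop: keep calling the model, executing any requested tools, until it answers without tool calls. Below is a self-contained sketch of that loop with the model client and tool execution stubbed out; `fake_completion`, `fake_call_tools`, and `run_loop` are illustrative stand-ins, not part of metorial-xai.

```python
import asyncio

def fake_completion(messages):
    """Stand-in for xai_client.chat.completions.create: requests one
    tool call on the first turn, then answers in plain text."""
    if not any(m.get("role") == "tool" for m in messages):
        return {"tool_calls": [{"id": "call_1", "name": "get_commits"}], "content": None}
    return {"tool_calls": None, "content": "Here are the latest commits."}

async def fake_call_tools(tool_calls):
    """Stand-in for xai_session.call_tools: returns one tool message per call."""
    return [{"role": "tool", "tool_call_id": c["id"], "content": "tool output"} for c in tool_calls]

async def run_loop(messages, max_iterations=25):
    """Alternate model turns and tool execution until the model stops calling tools."""
    for _ in range(max_iterations):
        response = fake_completion(messages)
        tool_calls = response["tool_calls"]
        if not tool_calls:
            return response["content"]  # model is done
        messages.append({"role": "assistant", "tool_calls": tool_calls})
        messages.extend(await fake_call_tools(tool_calls))
    raise RuntimeError("max_iterations exceeded")

print(asyncio.run(run_loop([{"role": "user", "content": "What are the latest commits?"}])))
```

This is the same loop that `metorial.run` performs for you, with `max_iterations` bounding the number of model/tool round trips.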
### Using Convenience Functions

```python
from metorial_xai import build_xai_tools, call_xai_tools

async def example():
    # Get tools in XAI format
    tools = build_xai_tools(tool_manager)

    # Call tools from an XAI response
    tool_messages = await call_xai_tools(tool_manager, tool_calls)
```
## API Reference

### MetorialXAISession

Main session class for XAI integration.

```python
session = MetorialXAISession(tool_manager)
```

**Properties:**

- `tools`: List of tools in OpenAI-compatible format with strict mode

**Methods:**

- `async call_tools(tool_calls)`: Execute tool calls and return tool messages

### build_xai_tools(tool_mgr)

Build XAI-compatible tool definitions.

**Returns:** List of tool definitions in OpenAI format with strict mode

### call_xai_tools(tool_mgr, tool_calls)

Execute tool calls from an XAI response.

**Returns:** List of tool messages
## Tool Format

Tools are converted to OpenAI-compatible format with strict mode enabled:

```python
{
    "type": "function",
    "function": {
        "name": "tool_name",
        "description": "Tool description",
        "parameters": {
            "type": "object",
            "properties": {...},
            "required": [...]
        },
        "strict": True
    }
}
```
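As a sketch, a strict-mode definition in this format for a hypothetical `get_commits` tool could be assembled like this (the tool name, parameters, and the `make_strict_tool` helper are illustrative, not part of the package):

```python
import json

def make_strict_tool(name: str, description: str, properties: dict, required: list) -> dict:
    """Assemble an OpenAI-compatible tool definition with strict mode enabled."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
            "strict": True,
        },
    }

tool = make_strict_tool(
    "get_commits",  # hypothetical tool name
    "List recent commits in a repository",
    {"repo": {"type": "string"}},
    ["repo"],
)
print(json.dumps(tool, indent=2))
```

With `"strict": True`, the model is constrained to emit arguments that match the JSON schema in `parameters` exactly.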
## XAI API Configuration

XAI uses the OpenAI-compatible API format. Configure your client like this:

```python
from openai import OpenAI

client = OpenAI(
    api_key="...your-xai-api-key...",
    base_url="https://api.x.ai/v1"
)
```
## Error Handling

```python
try:
    response = await metorial.run(
        "Your query", "...deployment-id...", xai_client,
        model="grok-beta", max_iterations=25
    )
except Exception as e:
    print(f"Request failed: {e}")
```

Tool errors are automatically handled and returned as error messages.
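Since `metorial.run` is a plain awaitable, transient failures (timeouts, rate limits) can also be wrapped in a generic retry helper. The sketch below retries any coroutine factory with exponential backoff; `with_retries` is an illustrative helper, not part of metorial-xai.

```python
import asyncio

async def with_retries(make_call, attempts=3, base_delay=1.0):
    """Await make_call(); on failure, retry with exponential backoff."""
    for attempt in range(attempts):
        try:
            return await make_call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            await asyncio.sleep(base_delay * 2 ** attempt)

# Usage sketch (assuming metorial and xai_client from the examples above):
# response = await with_retries(lambda: metorial.run(
#     "Your query", "...deployment-id...", xai_client,
#     model="grok-beta", max_iterations=25,
# ))
```

A factory (`lambda: ...`) is passed rather than a coroutine object because a coroutine can only be awaited once; each retry needs a fresh one.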
## License

MIT License - see the LICENSE file for details.
## File details

Details for the file metorial_xai-1.0.0rc5.tar.gz.

### File metadata

- Download URL: metorial_xai-1.0.0rc5.tar.gz
- Size: 5.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | f2ebc6a9a6b035a3e339615a6d8475ffe9a31a0ec31960632f407228bc43d5d1 |
| MD5 | 6a07f24985f883f7f7cda9b31a8a14a0 |
| BLAKE2b-256 | b0eeadcac0dd5eea5604dc0f439afaabb0c53a57b3a24cb2f904c4a5ec76fcb9 |
### Provenance

The following attestation bundle was made for metorial_xai-1.0.0rc5.tar.gz:

Publisher: release.yml on metorial/metorial-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: metorial_xai-1.0.0rc5.tar.gz
- Subject digest: f2ebc6a9a6b035a3e339615a6d8475ffe9a31a0ec31960632f407228bc43d5d1
- Sigstore transparency entry: 543748966
- Permalink: metorial/metorial-python@dbefb5a24b621248adfab8a843b3bddc52bc3977
- Branch / Tag: refs/heads/main
- Owner: https://github.com/metorial
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@dbefb5a24b621248adfab8a843b3bddc52bc3977
- Trigger Event: workflow_dispatch
## File details

Details for the file metorial_xai-1.0.0rc5-py3-none-any.whl.

### File metadata

- Download URL: metorial_xai-1.0.0rc5-py3-none-any.whl
- Size: 4.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 4fff490fd0bd371b99bc8754316723a02e0960fef061b659ce5d84919e41bd53 |
| MD5 | bf0f7244742dbd8b4843b2c74d5e0362 |
| BLAKE2b-256 | 6c7a8c1fbfa49e8e47df251643b9118931d517cdbc9fad636cf7b96794b771e1 |
### Provenance

The following attestation bundle was made for metorial_xai-1.0.0rc5-py3-none-any.whl:

Publisher: release.yml on metorial/metorial-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: metorial_xai-1.0.0rc5-py3-none-any.whl
- Subject digest: 4fff490fd0bd371b99bc8754316723a02e0960fef061b659ce5d84919e41bd53
- Sigstore transparency entry: 543748971
- Permalink: metorial/metorial-python@dbefb5a24b621248adfab8a843b3bddc52bc3977
- Branch / Tag: refs/heads/main
- Owner: https://github.com/metorial
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@dbefb5a24b621248adfab8a843b3bddc52bc3977
- Trigger Event: workflow_dispatch