Together AI provider for Metorial

metorial-togetherai

Together AI provider integration for Metorial.

Installation

pip install metorial openai

Quick Start

import asyncio
from metorial import Metorial, MetorialTogetherAI
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
together = AsyncOpenAI(
    api_key="your-together-api-key",
    base_url="https://api.together.xyz/v1"
)

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        # Cap the tool-calling loop to avoid runaway sessions
        for _ in range(10):
            response = await together.chat.completions.create(
                model="meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
                messages=messages,
                tools=session["tools"]
            )

            choice = response.choices[0]
            tool_calls = choice.message.tool_calls

            # No tool calls means the model has produced its final answer
            if not tool_calls:
                print(choice.message.content)
                break

            # Execute the requested tools via Metorial, then append the
            # assistant turn and the tool results to the history
            tool_responses = await session["callTools"](tool_calls)
            messages.append({"role": "assistant", "tool_calls": tool_calls})
            messages.extend(tool_responses)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialTogetherAI.chat_completions,
        {"serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}]},
        session_handler
    )

asyncio.run(main())
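Each round of the loop above appends an assistant message carrying the tool calls, followed by one "tool" message per result. As a minimal sketch of that history shape (the tool name, call id, and contents below are hypothetical, and the exact fields of the responses returned by `callTools` are determined by Metorial):

```python
messages = [{"role": "user", "content": "What's the latest news?"}]

# Assistant turn that requested a tool, in the OpenAI function-calling
# format (id, name, and arguments here are made up for illustration)
messages.append({
    "role": "assistant",
    "tool_calls": [{
        "id": "call_123",
        "type": "function",
        "function": {"name": "search_news", "arguments": '{"query": "ai"}'},
    }],
})

# One tool message per call, matched to the request by tool_call_id
messages.append({
    "role": "tool",
    "tool_call_id": "call_123",
    "content": "Top story: ...",
})

print(len(messages))  # 3
```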

Streaming

import asyncio
from metorial import Metorial, MetorialTogetherAI
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
together = AsyncOpenAI(
    api_key="your-together-api-key",
    base_url="https://api.together.xyz/v1"
)

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        stream = await together.chat.completions.create(
            model="meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
            messages=messages,
            tools=session["tools"],
            stream=True
        )

        # Print tokens as they arrive; chunks without text content
        # (e.g. tool-call deltas) are skipped
        async for chunk in stream:
            if chunk.choices and chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="", flush=True)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialTogetherAI.chat_completions,
        {
            "serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}],
            "streaming": True,  # Required for streaming with tool calls
        },
        session_handler
    )

asyncio.run(main())
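If you need the full reply as a string in addition to live output, accumulate the deltas as they arrive. The real stream is asynchronous; the stub chunk objects below only mimic the shape of OpenAI streaming chunks so the sketch runs without network access:

```python
from dataclasses import dataclass
from typing import Optional

# Stub objects standing in for OpenAI streaming chunks
@dataclass
class _Delta:
    content: Optional[str]

@dataclass
class _Choice:
    delta: _Delta

@dataclass
class _Chunk:
    choices: list

stream = [
    _Chunk([_Choice(_Delta("Hello"))]),
    _Chunk([_Choice(_Delta(None))]),   # e.g. a tool-call delta, no text
    _Chunk([_Choice(_Delta(", world"))]),
]

parts = []
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        parts.append(delta)

full_reply = "".join(parts)
print(full_reply)  # Hello, world
```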

Supported Models

  • Llama-4, Qwen-3, and other models available on Together AI

Session Object

async def session_handler(session):
    tools = session["tools"]           # Tool definitions in OpenAI-compatible format
    call_tools = session["callTools"]  # Execute tools and get responses
    close_session = session["closeSession"]  # Close the session when done
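The entries in `session["tools"]` follow the OpenAI function-calling schema. As an illustration of that shape (the tool name and parameters below are hypothetical; the real definitions come from your Metorial server deployment):

```python
# Hypothetical example of the OpenAI-compatible tool shape that
# session["tools"] yields
example_tool = {
    "type": "function",
    "function": {
        "name": "search_news",  # hypothetical tool name
        "description": "Search recent news articles by query.",
        "parameters": {         # JSON Schema describing the arguments
            "type": "object",
            "properties": {
                "query": {"type": "string"}
            },
            "required": ["query"],
        },
    },
}

# Lists of such dicts are passed straight to
# chat.completions.create(tools=...)
tools = [example_tool]
print(tools[0]["function"]["name"])  # search_news
```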

Error Handling

from metorial import MetorialAPIError

# Inside an async function — with_provider_session is a coroutine
try:
    await metorial.with_provider_session(...)
except MetorialAPIError as e:
    print(f"API Error: {e.message} (Status: {e.status})")
except Exception as e:
    print(f"Unexpected error: {e}")
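Transient server-side failures are often worth retrying with backoff. The sketch below is a generic retry helper, not part of the metorial package; `TransientError` is a stand-in for `MetorialAPIError` (assumed here to carry a `status` attribute, as in the snippet above) so the example runs without network access:

```python
import asyncio

# Stand-in for MetorialAPIError so this sketch is self-contained
class TransientError(Exception):
    def __init__(self, status):
        self.status = status

async def with_retries(coro_factory, attempts=3, base_delay=0.01):
    """Retry an async call on 5xx errors with exponential backoff."""
    for attempt in range(attempts):
        try:
            return await coro_factory()
        except TransientError as e:
            # Re-raise client errors immediately, and server errors
            # once the retry budget is exhausted
            if e.status < 500 or attempt == attempts - 1:
                raise
            await asyncio.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}

async def flaky():
    # Fails twice with a 503, then succeeds
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError(status=503)
    return "ok"

result = asyncio.run(with_retries(flaky))
print(result)  # ok, after two retried failures
```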

License

MIT License - see LICENSE file for details.
