DeepSeek provider for Metorial

Project description

metorial-deepseek

DeepSeek provider integration for Metorial.

Installation

pip install metorial openai

Quick Start

import asyncio
from metorial import Metorial, MetorialDeepSeek
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
deepseek = AsyncOpenAI(
    api_key="your-deepseek-api-key",
    base_url="https://api.deepseek.com"
)

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        for _ in range(10):  # cap the agent loop at 10 rounds of tool calls
            response = await deepseek.chat.completions.create(
                model="deepseek-chat",
                messages=messages,
                tools=session["tools"]
            )

            choice = response.choices[0]
            tool_calls = choice.message.tool_calls

            if not tool_calls:
                print(choice.message.content)
                break

            tool_responses = await session["callTools"](tool_calls)
            messages.append({"role": "assistant", "tool_calls": tool_calls})
            messages.extend(tool_responses)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialDeepSeek.chat_completions,
        {"serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}]},
        session_handler
    )

asyncio.run(main())

Streaming

import asyncio
from metorial import Metorial, MetorialDeepSeek
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
deepseek = AsyncOpenAI(
    api_key="your-deepseek-api-key",
    base_url="https://api.deepseek.com"
)

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        stream = await deepseek.chat.completions.create(
            model="deepseek-chat",
            messages=messages,
            tools=session["tools"],
            stream=True
        )

        async for chunk in stream:
            if chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="", flush=True)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialDeepSeek.chat_completions,
        {
            "serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}],
            "streaming": True,  # Required for streaming with tool calls
        },
        session_handler
    )

asyncio.run(main())
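
When streaming, the reply arrives as content fragments spread across chunks, so you typically accumulate the deltas if you need the full text afterwards. The sketch below shows that accumulation pattern; the stand-in chunk objects only mimic the shape of the OpenAI SDK's streaming chunks (`choices[0].delta.content`) so the example runs without an API key.

```python
from types import SimpleNamespace

def accumulate_content(chunks):
    """Join the content fragments from a stream of chat-completion chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content:  # skip chunks with no text delta (e.g. role/tool-call chunks)
            parts.append(delta.content)
    return "".join(parts)

# Stand-in chunks mimicking the shape of OpenAI streaming chunks.
def _chunk(text):
    return SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=text))])

full = accumulate_content([_chunk("Hello"), _chunk(", "), _chunk("world"), _chunk(None)])
print(full)  # Hello, world
```

With the real client, the same function works on `async for chunk in stream` by collecting chunks into a list first, or by inlining the append inside the loop.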

Supported Models

  • deepseek-chat: general-purpose chat model
  • deepseek-reasoner: reasoning model that produces chain-of-thought before its final answer
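
The two models return differently shaped messages: per DeepSeek's API documentation, deepseek-reasoner messages carry an extra `reasoning_content` field alongside `content`, while deepseek-chat messages have only `content`. The sketch below separates the two defensively with `getattr`; the stand-in messages are illustrative, not real SDK objects.

```python
from types import SimpleNamespace

def split_reasoning(message):
    """Return (reasoning, answer) from a chat message. deepseek-reasoner
    responses carry a reasoning_content field alongside content;
    deepseek-chat responses have only content, so reasoning is None."""
    return getattr(message, "reasoning_content", None), message.content

# Stand-in messages mimicking the two models' response shapes.
reasoner_msg = SimpleNamespace(reasoning_content="step 1... step 2...", content="42")
chat_msg = SimpleNamespace(content="hello")

print(split_reasoning(reasoner_msg))  # ('step 1... step 2...', '42')
print(split_reasoning(chat_msg))      # (None, 'hello')
```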

Session Object

async def session_handler(session):
    tools = session["tools"]           # Tool definitions in OpenAI-compatible format
    call_tools = session["callTools"]  # Execute tools and get responses
    close_session = session["closeSession"]  # Close the session when done
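
These three session keys are all the Quick Start loop needs, so the loop can be factored into a reusable helper. `run_tool_loop` below is an illustrative sketch, not part of metorial; the stubs in the demo only mimic the response shape of the OpenAI SDK (`choices[0].message`) so the example runs without either API.

```python
import asyncio
from types import SimpleNamespace

async def run_tool_loop(create_completion, session, messages, max_rounds=10):
    """Drive the model/tool loop: call the model until it answers without
    tool calls, executing requested tools via session["callTools"].
    Returns the final assistant text, or None if max_rounds is exhausted."""
    for _ in range(max_rounds):
        choice = (await create_completion(messages, session["tools"])).choices[0]
        tool_calls = choice.message.tool_calls
        if not tool_calls:
            return choice.message.content
        messages.append({"role": "assistant", "tool_calls": tool_calls})
        messages.extend(await session["callTools"](tool_calls))
    return None

# --- Demo with stubs standing in for the DeepSeek client and Metorial session ---
def _response(content=None, tool_calls=None):
    msg = SimpleNamespace(content=content, tool_calls=tool_calls)
    return SimpleNamespace(choices=[SimpleNamespace(message=msg)])

async def _fake_create(messages, tools):
    # First round: request a tool; once a tool result is present, answer.
    if not any(m.get("role") == "tool" for m in messages):
        return _response(tool_calls=[{"id": "call_1"}])
    return _response(content="done")

async def _fake_call_tools(tool_calls):
    return [{"role": "tool", "tool_call_id": "call_1", "content": "result"}]

session = {"tools": [], "callTools": _fake_call_tools}
answer = asyncio.run(run_tool_loop(_fake_create, session, [{"role": "user", "content": "hi"}]))
print(answer)  # done
```

With the real objects, `create_completion` would wrap `deepseek.chat.completions.create` and `session` would be the dict Metorial passes to your handler.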

Error Handling

from metorial import MetorialAPIError

try:
    await metorial.with_provider_session(...)
except MetorialAPIError as e:
    print(f"API Error: {e.message} (Status: {e.status})")
except Exception as e:
    print(f"Unexpected error: {e}")
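
Since `MetorialAPIError` exposes a `status`, one common pattern is to retry transient failures (429 and 5xx) with exponential backoff while re-raising client errors immediately. The sketch below defines a local stand-in for `MetorialAPIError` (with the same `message`/`status` attributes used above) so it runs without the package installed; in real code you would import the exception from `metorial` instead.

```python
import time

class MetorialAPIError(Exception):
    """Local stand-in for metorial's MetorialAPIError, so the sketch
    runs standalone; import the real class from metorial in practice."""
    def __init__(self, message, status=None):
        super().__init__(message)
        self.message = message
        self.status = status

def with_retries(fn, attempts=3, base_delay=0.01):
    """Retry fn on MetorialAPIError with exponential backoff; statuses
    other than 429/5xx are treated as non-retryable and re-raised."""
    for attempt in range(attempts):
        try:
            return fn()
        except MetorialAPIError as e:
            retryable = e.status is None or e.status == 429 or e.status >= 500
            if not retryable or attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Demo: fails twice with a 503, then succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise MetorialAPIError("temporarily unavailable", status=503)
    return "ok"

result = with_retries(flaky)
print(result)  # ok
```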

License

MIT License - see LICENSE file for details.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

metorial_deepseek-1.0.5.tar.gz (5.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

metorial_deepseek-1.0.5-py3-none-any.whl (4.0 kB)

Uploaded Python 3

File details

Details for the file metorial_deepseek-1.0.5.tar.gz.

File metadata

  • Download URL: metorial_deepseek-1.0.5.tar.gz
  • Size: 5.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for metorial_deepseek-1.0.5.tar.gz
Algorithm Hash digest
SHA256 051583a495b481f3d10f78374b7c99f52a06703181b2fd75516efa330891fd92
MD5 9fbcc5346bf0bd14b4ee475a5f5d9e90
BLAKE2b-256 613e1aae6b81cf7405dfab64801f299765b1bb02653529a6a5c3c5df3528eb01

See more details on using hashes here.

Provenance

The following attestation bundles were made for metorial_deepseek-1.0.5.tar.gz:

Publisher: release.yml on metorial/metorial-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file metorial_deepseek-1.0.5-py3-none-any.whl.

File metadata

File hashes

Hashes for metorial_deepseek-1.0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 20220c2e18a36e3f271f29c6d6984abc43782d9714991b562ea9ecd30046d44e
MD5 4c56d390dc98d45fff0db2260f06186e
BLAKE2b-256 f0be1c1f3d915ffe34ce345a939520ddf33ed942260dd6230136df09be289437

See more details on using hashes here.

Provenance

The following attestation bundles were made for metorial_deepseek-1.0.5-py3-none-any.whl:

Publisher: release.yml on metorial/metorial-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
