Python Client SDK for the Mistral AI API.

Project description

Open and portable generative AI for devs and businesses

The Mistral Python library provides convenient access to the Mistral REST API from any Python 3.7+ application. The library includes type definitions for all request params and response fields, and offers both synchronous and asynchronous clients powered by httpx.

[!WARNING]
This SDK is not yet ready for production use.

SDK Installation

PIP

pip install mistral-dev

Poetry

poetry add mistral-dev

SDK Example Usage

Create Chat Completions

This example shows how to create chat completions by streaming the response.

# Synchronous Example
import mistral_dev
from mistral_dev import Mistral

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)


res = s.chat.stream(request={
    "model": "mistral-small-latest",
    "messages": [
        {
            "role": mistral_dev.ChatCompletionRole.USER,
            "content": "Who is the best French painter? Answer in JSON.",
        },
    ],
    "response_format": {
        "type": "json_object",
    },
    "max_tokens": 512,
    "random_seed": 1337,
})

if res is not None:
    for event in res:
        # handle event
        print(event)

The same SDK client can also be used to make asynchronous requests with asyncio.

# Asynchronous Example
import asyncio
import mistral_dev
from mistral_dev import Mistral

async def main():
    s = Mistral(
        api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
    )
    res = await s.chat.stream_async(request={
        "model": "mistral-small-latest",
        "messages": [
            {
                "role": mistral_dev.ChatCompletionRole.USER,
                "content": "Who is the best French painter? Answer in JSON.",
            },
        ],
        "response_format": {
            "type": "json_object",
        },
        "max_tokens": 512,
        "random_seed": 1337,
    })
    if res is not None:
        async for event in res:
            # handle event
            print(event)

asyncio.run(main())

Available Resources and Operations

chat

  • stream - Create Chat Completions Stream
  • create - Create Chat Completions (a non-streaming sketch follows this list)

fim

  • create - Create FIM Completions

embeddings

models

  • list - List Available Models

files

fine_tuning
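
The non-streaming create operation on chat is listed above but not shown elsewhere in this README. Below is a minimal sketch, assuming chat.create accepts the same request shape as chat.stream and returns a single response object rather than an event stream:

import mistral_dev
from mistral_dev import Mistral

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)

# Hypothetical non-streaming call; the method name is taken from the
# operation list above, and the request shape mirrors the stream example.
res = s.chat.create(request={
    "model": "mistral-small-latest",
    "messages": [
        {
            "role": mistral_dev.ChatCompletionRole.USER,
            "content": "Who is the best French painter? Answer in JSON.",
        },
    ],
    "max_tokens": 512,
})

if res is not None:
    # handle the completed response
    print(res)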

Server-sent event streaming

Server-sent events are used to stream content from certain operations. These operations expose the stream as a Generator that can be consumed with a simple for loop. The loop terminates when the server has no more events to send and closes the underlying connection.

import mistral_dev
from mistral_dev import Mistral

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)


res = s.chat.stream(request={
    "model": "mistral-small-latest",
    "messages": [
        {
            "role": mistral_dev.ChatCompletionRole.USER,
            "content": "Who is the best French painter? Answer in JSON.",
        },
    ],
    "response_format": {
        "type": "json_object",
    },
    "max_tokens": 512,
    "random_seed": 1337,
})

if res is not None:
    for event in res:
        # handle event
        print(event)

File uploads

Certain SDK methods accept file objects as part of a request body or multi-part request. It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potentially crashing with out-of-memory errors when working with very large files. The following example demonstrates how to attach a file stream to a request.

[!TIP]

For endpoints that handle file uploads, byte arrays can also be used. However, using streams is recommended for large files; a byte-array variant is sketched after the example below.

from mistral_dev import Mistral

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)


res = s.files.upload(purpose="fine-tune", file={
    "file_name": "your_file_here",
    "content": open("<file_path>", "rb"),
})

if res is not None:
    # handle response
    pass
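
As noted in the tip above, smaller files can also be passed as an in-memory byte array instead of a stream. A minimal sketch, assuming the content field accepts raw bytes as well as file-like objects:

from mistral_dev import Mistral

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)

# Read the whole file into memory; only reasonable for small files.
with open("<file_path>", "rb") as f:
    data = f.read()

res = s.files.upload(purpose="fine-tune", file={
    "file_name": "your_file_here",
    "content": data,  # bytes instead of a file stream (assumed to be accepted)
})

if res is not None:
    # handle response
    pass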

Error Handling

Handling errors in this SDK should largely match your expectations. All operations return a response object or raise an error. If Error objects are specified in your OpenAPI Spec, the SDK will raise the appropriate Error type.

Error Object                Status Code  Content Type
models.BadRequest           400          application/json
models.Unauthorized         401          application/json
models.Forbidden            403          application/json
models.NotFound             404          application/json
models.TooManyRequests      429          application/json
models.InternalServerError  500          application/json
models.ServiceUnavailable   503          application/json
models.SDKError             4xx-5xx      /

Example

import mistral_dev
from mistral_dev import Mistral, models

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)

res = None
try:
    res = s.chat.stream(request={
        "model": "mistral-small-latest",
        "messages": [
            {
                "role": mistral_dev.ChatCompletionRole.USER,
                "content": "Who is the best French painter? Answer in JSON.",
            },
        ],
        "response_format": {
            "type": "json_object",
        },
        "max_tokens": 512,
        "random_seed": 1337,
    })

except models.BadRequest as e:
    # handle exception
    raise e
except models.Unauthorized as e:
    # handle exception
    raise e
except models.Forbidden as e:
    # handle exception
    raise e
except models.NotFound as e:
    # handle exception
    raise e
except models.TooManyRequests as e:
    # handle exception
    raise e
except models.InternalServerError as e:
    # handle exception
    raise e
except models.ServiceUnavailable as e:
    # handle exception
    raise e
except models.SDKError as e:
    # handle exception
    raise e

if res is not None:
    for event in res:
        # handle event
        print(event)
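
Transient errors such as rate limits can be retried by the caller. This is not an SDK feature; the following is just a usage sketch built on the documented models.TooManyRequests error, with arbitrary example delays:

import time

import mistral_dev
from mistral_dev import Mistral, models

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)

request = {
    "model": "mistral-small-latest",
    "messages": [
        {"role": mistral_dev.ChatCompletionRole.USER, "content": "Hello"},
    ],
}

res = None
for attempt in range(3):
    try:
        res = s.chat.stream(request=request)
        break
    except models.TooManyRequests:
        if attempt == 2:
            raise
        # Back off before retrying (example delays: 2s, then 4s).
        time.sleep(2 ** (attempt + 1))

if res is not None:
    for event in res:
        print(event)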

Server Selection

Select Server by Name

You can override the default server globally by passing a server name to the server: str optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the names associated with the available servers:

Name  Server                     Variables
prod  https://api.mistral.ai/v1  None

Example

import mistral_dev
from mistral_dev import Mistral

s = Mistral(
    server="prod",
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)


res = s.chat.stream(request={
    "model": "mistral-small-latest",
    "messages": [
        {
            "role": mistral_dev.ChatCompletionRole.USER,
            "content": "Who is the best French painter? Answer in JSON.",
        },
    ],
    "response_format": {
        "type": "json_object",
    },
    "max_tokens": 512,
    "random_seed": 1337,
})

if res is not None:
    for event in res:
        # handle event
        print(event)

Override Server URL Per-Client

The default server can also be overridden globally by passing a URL to the server_url: str optional parameter when initializing the SDK client instance. For example:

import mistral_dev
from mistral_dev import Mistral

s = Mistral(
    server_url="https://api.mistral.ai/v1",
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)


res = s.chat.stream(request={
    "model": "mistral-small-latest",
    "messages": [
        {
            "role": mistral_dev.ChatCompletionRole.USER,
            "content": "Who is the best French painter? Answer in JSON.",
        },
    ],
    "response_format": {
        "type": "json_object",
    },
    "max_tokens": 512,
    "random_seed": 1337,
})

if res is not None:
    for event in res:
        # handle event
        print(event)

Custom HTTP Client

The Python SDK makes API calls using the httpx HTTP library. To provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level settings, you can initialize the SDK client with your own HTTP client instance. Depending on whether you are using the sync or async version of the SDK, pass an instance of HttpClient or AsyncHttpClient respectively; these are Protocols that ensure the client has the methods needed to make API calls. This allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can simply pass an instance of httpx.Client or httpx.AsyncClient directly.

For example, you could specify a header for every request that this SDK makes as follows:

from mistral_dev import Mistral
import httpx

http_client = httpx.Client(headers={"x-custom-header": "someValue"})
s = Mistral(client=http_client)

or you could wrap the client with your own custom logic:

from typing import Any, Optional, Union

from mistral_dev import Mistral
from mistral_dev.httpclient import AsyncHttpClient
import httpx

class CustomClient(AsyncHttpClient):
    client: AsyncHttpClient

    def __init__(self, client: AsyncHttpClient):
        self.client = client

    async def send(
        self,
        request: httpx.Request,
        *,
        stream: bool = False,
        auth: Union[
            httpx._types.AuthTypes, httpx._client.UseClientDefault, None
        ] = httpx.USE_CLIENT_DEFAULT,
        follow_redirects: Union[
            bool, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
    ) -> httpx.Response:
        request.headers["Client-Level-Header"] = "added by client"

        return await self.client.send(
            request, stream=stream, auth=auth, follow_redirects=follow_redirects
        )

    def build_request(
        self,
        method: str,
        url: httpx._types.URLTypes,
        *,
        content: Optional[httpx._types.RequestContent] = None,
        data: Optional[httpx._types.RequestData] = None,
        files: Optional[httpx._types.RequestFiles] = None,
        json: Optional[Any] = None,
        params: Optional[httpx._types.QueryParamTypes] = None,
        headers: Optional[httpx._types.HeaderTypes] = None,
        cookies: Optional[httpx._types.CookieTypes] = None,
        timeout: Union[
            httpx._types.TimeoutTypes, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
        extensions: Optional[httpx._types.RequestExtensions] = None,
    ) -> httpx.Request:
        return self.client.build_request(
            method,
            url,
            content=content,
            data=data,
            files=files,
            json=json,
            params=params,
            headers=headers,
            cookies=cookies,
            timeout=timeout,
            extensions=extensions,
        )

s = Mistral(async_client=CustomClient(httpx.AsyncClient()))

Authentication

Per-Client Security Schemes

This SDK supports the following security scheme globally:

Name          Type  Scheme
api_key_auth  http  HTTP Bearer

To authenticate with the API, the api_key_auth parameter must be set when initializing the SDK client instance. For example:

import mistral_dev
from mistral_dev import Mistral

s = Mistral(
    api_key_auth="<YOUR_BEARER_TOKEN_HERE>",
)


res = s.chat.stream(request={
    "model": "mistral-small-latest",
    "messages": [
        {
            "role": mistral_dev.ChatCompletionRole.USER,
            "content": "Who is the best French painter? Answer in JSON.",
        },
    ],
    "response_format": {
        "type": "json_object",
    },
    "max_tokens": 512,
    "random_seed": 1337,
})

if res is not None:
    for event in res:
        # handle event
        print(event)

Development

Maturity

This SDK is in beta, and there may be breaking changes between versions without a major version update. Therefore, we recommend pinning usage to a specific package version. This way, you can install the same version each time without breaking changes unless you are intentionally looking for the latest version.
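
For example, you can pin the exact version shown in the file listing below (0.5.4a0 at the time of writing):

pip install mistral-dev==0.5.4a0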

Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Feel free to open a PR or a GitHub issue as a proof of concept, and we'll do our best to include it in a future release!

SDK Created by Speakeasy

Download files

Download the file for your platform.

Source Distribution

mistral_dev-0.5.4a0.tar.gz (43.2 kB)

Uploaded Source

Built Distribution

mistral_dev-0.5.4a0-py3-none-any.whl (80.8 kB)

Uploaded Python 3

File details

Details for the file mistral_dev-0.5.4a0.tar.gz.

File metadata

  • Download URL: mistral_dev-0.5.4a0.tar.gz
  • Upload date:
  • Size: 43.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.8.18 Linux/6.5.0-1022-azure

File hashes

Hashes for mistral_dev-0.5.4a0.tar.gz
Algorithm    Hash digest
SHA256       1230b50377ae4c3b5e6688c8a9bf8898bedfa9ff776e07afaf198109869d2930
MD5          b5fdbcc7186cbf2874d1160b34f33acc
BLAKE2b-256  3ef660f9aa4a75a83329c0d5dd908d53cd6607882b53103b77aa4636f8ad3eae


File details

Details for the file mistral_dev-0.5.4a0-py3-none-any.whl.

File metadata

  • Download URL: mistral_dev-0.5.4a0-py3-none-any.whl
  • Upload date:
  • Size: 80.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.8.18 Linux/6.5.0-1022-azure

File hashes

Hashes for mistral_dev-0.5.4a0-py3-none-any.whl
Algorithm    Hash digest
SHA256       4d81cd68fb4df2cf49ed0b1e4e07a7171528c2b4dd7afb0f32eafd082a7e92ec
MD5          96f39320804de5942dbef26bec93b708
BLAKE2b-256  f47217d7e2697b5faa7c139b59ec9ed79a097b7c14439b91f5aa9834bfd8264f

