Python Client SDK generated by Speakeasy.

Friendli Python SDK


Supercharge Generative AI Serving with Friendli 🚀

Token Setup

When using the Friendli Python SDK, you need to provide a Friendli Token for authentication and authorization. A Friendli Token serves as an alternative to signing in with an email and password. You can generate a new Friendli Token in the Friendli Suite, on your "User settings" page, by following the steps below.

  1. Go to the Friendli Suite and sign in with your account.
  2. Click the profile icon at the top-right corner of the page.
  3. Click the "User settings" menu.
  4. Go to the "Tokens" tab on the navigation bar.
  5. Create a new Friendli Token by clicking the "Create token" button.
  6. Copy the token and save it in a safe place. You will not be able to see this token again once the page is refreshed.
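With the token saved, a common pattern is to export it as the FRIENDLI_TOKEN environment variable (which the examples below read via os.getenv) and fail fast when it is missing. The helper name below is illustrative, not part of the SDK:

```python
import os


def get_friendli_token() -> str:
    """Read the Friendli Token from the environment, failing fast if unset."""
    token = os.getenv("FRIENDLI_TOKEN", "")
    if not token:
        raise RuntimeError(
            "FRIENDLI_TOKEN is not set; create a token in Friendli Suite "
            "(User settings > Tokens) and export it first."
        )
    return token
```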


SDK Installation

The SDK can be installed with either pip or poetry package managers.

PIP

PIP is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.

pip install friendli

Poetry

Poetry is a modern tool that simplifies dependency management and package publishing by using a single pyproject.toml file to handle project metadata and dependencies.

poetry add friendli

SDK Example Usage

Chat completions

Given a list of messages forming a conversation, the model generates a response.

# Synchronous Example
from friendli import SyncFriendli
import os

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as s:
    res = s.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    print(res)

The same requests can also be made asynchronously through the AsyncFriendli client, driven by asyncio.

# Asynchronous Example
import asyncio
from friendli import AsyncFriendli
import os


async def main():
    async with AsyncFriendli(
        token=os.getenv("FRIENDLI_TOKEN", ""),
    ) as s:
        res = await s.serverless.chat.complete_async(
            model="meta-llama-3.1-8b-instruct",
            messages=[
                {
                    "role": "system",
                    "content": "You are a helpful assistant.",
                },
                {
                    "role": "user",
                    "content": "Hello!",
                },
            ],
            max_tokens=200,
        )

        print(res)

asyncio.run(main())
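Because complete_async is a coroutine, independent requests can be issued concurrently with asyncio.gather. The sketch below shows only the pattern; fake_complete is a stub standing in for the SDK call:

```python
import asyncio


async def fake_complete(prompt: str) -> str:
    # Stub standing in for s.serverless.chat.complete_async(...).
    await asyncio.sleep(0.01)  # simulate network latency
    return f"response to: {prompt}"


async def run_all(prompts: list) -> list:
    # Launch all requests at once and await them together; results
    # come back in the same order as the inputs.
    return await asyncio.gather(*(fake_complete(p) for p in prompts))


results = asyncio.run(run_all(["Hello!", "How are you?"]))
print(results)
```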

Tool assisted chat completions

Given a list of messages forming a conversation, the model generates a response. Additionally, the model can utilize built-in tools for tool calls, enhancing its capability to provide more comprehensive and actionable responses.

# Synchronous Example
from friendli import SyncFriendli
import os

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as s:
    res = s.serverless.tool_assisted_chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "user",
                "content": "What is 3 + 6?",
            },
        ],
        max_tokens=200,
        tools=[
            {
                "type": "math:calculator",
            },
        ],
    )

    print(res)

The same requests can also be made asynchronously through the AsyncFriendli client, driven by asyncio.

# Asynchronous Example
import asyncio
from friendli import AsyncFriendli
import os


async def main():
    async with AsyncFriendli(
        token=os.getenv("FRIENDLI_TOKEN", ""),
    ) as s:
        res = await s.serverless.tool_assisted_chat.complete_async(
            model="meta-llama-3.1-8b-instruct",
            messages=[
                {
                    "role": "user",
                    "content": "What is 3 + 6?",
                },
            ],
            max_tokens=200,
            tools=[
                {
                    "type": "math:calculator",
                },
            ],
        )

        print(res)

asyncio.run(main())

Available Resources and Operations

Available methods

dedicated

  • dedicated.chat
  • dedicated.completions
  • dedicated.token

serverless

  • serverless.chat
  • serverless.completions
  • serverless.token
  • serverless.tool_assisted_chat
      • complete - Tool assisted chat completions
      • stream - Stream tool assisted chat completions

Server-sent event streaming

Server-sent events are used to stream content from certain operations. These operations expose the stream as a Generator that can be consumed with a simple for loop. The loop terminates when the server has no more events to send and closes the underlying connection.

The stream is also a Context Manager: when used in a with statement, the underlying connection is closed as soon as the context exits.

from friendli import SyncFriendli
import os

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as s:
    res = s.serverless.chat.stream(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    if res is not None:
        with res as event_stream:
            for event in event_stream:
                # handle event
                print(event, flush=True)
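On the wire, server-sent events are text blocks separated by blank lines, each carrying one or more data: fields. The SDK decodes these for you; the parser below is only a simplified illustration of the format, not the SDK's implementation:

```python
def parse_sse(raw: str) -> list:
    """Extract data payloads from a raw SSE text stream (simplified)."""
    payloads = []
    for block in raw.split("\n\n"):
        # An event may span several 'data:' lines, joined with newlines.
        data_lines = [
            line[len("data:"):].strip()
            for line in block.splitlines()
            if line.startswith("data:")
        ]
        if data_lines:
            payloads.append("\n".join(data_lines))
    return payloads


raw = "data: Hello\n\ndata: world\n\ndata: [DONE]\n\n"
print(parse_sse(raw))  # ['Hello', 'world', '[DONE]']
```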

Retries

Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.

To change the default retry strategy for a single API call, simply provide a RetryConfig object to the call:

from friendli import SyncFriendli
from friendli.utils import BackoffStrategy, RetryConfig
import os

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as s:
    res = s.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
        retries=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False),
    )

    print(res)
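Assuming the common Speakeasy convention, the four BackoffStrategy arguments are the initial interval, maximum interval, growth exponent, and maximum elapsed time, with intervals in milliseconds; treat that reading as an assumption rather than documented fact. The delay schedule such a strategy produces can be sketched as:

```python
def backoff_delays(initial, max_interval, exponent, max_elapsed):
    """Exponential backoff schedule: initial * exponent**n per attempt,
    capped at max_interval, stopping once max_elapsed would be exceeded.
    Illustrative only; the SDK's RetryConfig may differ in detail."""
    delays, elapsed, n = [], 0.0, 0
    while True:
        d = min(initial * exponent ** n, max_interval)
        if elapsed + d > max_elapsed:
            break
        delays.append(d)
        elapsed += d
        n += 1
    return delays


# With BackoffStrategy(1, 50, 1.1, 100) the waits grow by 10% per attempt.
print(backoff_delays(1, 50, 1.1, 100)[:4])
```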

If you'd like to override the default retry strategy for all operations that support retries, you can use the retry_config optional parameter when initializing the SDK:

from friendli import SyncFriendli
from friendli.utils import BackoffStrategy, RetryConfig
import os

with SyncFriendli(
    retry_config=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False),
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as s:
    res = s.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    print(res)

Error Handling

Handling errors in this SDK should largely match your expectations. All operations return a response object or raise an exception.

By default, an API error will raise a models.SDKError exception, which has the following properties:

| Property      | Type           | Description           |
| ------------- | -------------- | --------------------- |
| .status_code  | int            | The HTTP status code  |
| .message      | str            | The error message     |
| .raw_response | httpx.Response | The raw HTTP response |
| .body         | str            | The response content  |

When custom error responses are specified for an operation, the SDK may also raise their associated exceptions. Refer to the respective Errors tables in the SDK docs for details on the possible exception types for each operation. For example, the complete_async method may raise the following exceptions:

| Error Type      | Status Code | Content Type |
| --------------- | ----------- | ------------ |
| models.SDKError | 4XX, 5XX    | */*          |

Example

from friendli import SyncFriendli, models
import os

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as s:
    res = None
    try:
        res = s.serverless.chat.complete(
            model="meta-llama-3.1-8b-instruct",
            messages=[
                {
                    "role": "system",
                    "content": "You are a helpful assistant.",
                },
                {
                    "role": "user",
                    "content": "Hello!",
                },
            ],
            max_tokens=200,
        )

        print(res)

    except models.SDKError as e:
        # handle exception
        raise e
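Because SDKError carries a status_code, a handler can separate transient server errors from client errors that will fail on every attempt. The helper below is an illustrative convention, not part of the SDK:

```python
def is_retryable(status_code: int) -> bool:
    # Rate limits (429) and server-side errors (5XX) are typically
    # transient; other 4XX errors indicate a bad request.
    return status_code == 429 or 500 <= status_code <= 599


print(is_retryable(503), is_retryable(404))  # True False
```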

Server Selection

Override Server URL Per-Client

The default server can be overridden globally by passing a URL to the server_url: str optional parameter when initializing the SDK client instance. For example:

from friendli import SyncFriendli
import os

with SyncFriendli(
    server_url="https://api.friendli.ai",
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as s:
    res = s.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    print(res)

Custom HTTP Client

The Python SDK makes API calls using the httpx HTTP library. To provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level settings, you can initialize the SDK client with your own HTTP client instance. Depending on whether you are using the sync or async version of the SDK, pass an instance of HttpClient or AsyncHttpClient respectively; these are Protocols that ensure the client has the methods needed to make API calls. This lets you wrap the client with custom logic such as header injection, logging, or error handling, or you can simply pass an instance of httpx.Client or httpx.AsyncClient directly.

For example, you could specify a header for every request that this SDK makes as follows:

from friendli import SyncFriendli
import httpx

http_client = httpx.Client(headers={"x-custom-header": "someValue"})
s = SyncFriendli(client=http_client)

or you could wrap the client with your own custom logic:

from friendli import AsyncFriendli
from friendli.httpclient import AsyncHttpClient
import httpx
from typing import Any, Optional, Union


class CustomClient(AsyncHttpClient):
    client: AsyncHttpClient

    def __init__(self, client: AsyncHttpClient):
        self.client = client

    async def send(
        self,
        request: httpx.Request,
        *,
        stream: bool = False,
        auth: Union[
            httpx._types.AuthTypes, httpx._client.UseClientDefault, None
        ] = httpx.USE_CLIENT_DEFAULT,
        follow_redirects: Union[
            bool, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
    ) -> httpx.Response:
        request.headers["Client-Level-Header"] = "added by client"

        return await self.client.send(
            request, stream=stream, auth=auth, follow_redirects=follow_redirects
        )

    def build_request(
        self,
        method: str,
        url: httpx._types.URLTypes,
        *,
        content: Optional[httpx._types.RequestContent] = None,
        data: Optional[httpx._types.RequestData] = None,
        files: Optional[httpx._types.RequestFiles] = None,
        json: Optional[Any] = None,
        params: Optional[httpx._types.QueryParamTypes] = None,
        headers: Optional[httpx._types.HeaderTypes] = None,
        cookies: Optional[httpx._types.CookieTypes] = None,
        timeout: Union[
            httpx._types.TimeoutTypes, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
        extensions: Optional[httpx._types.RequestExtensions] = None,
    ) -> httpx.Request:
        return self.client.build_request(
            method,
            url,
            content=content,
            data=data,
            files=files,
            json=json,
            params=params,
            headers=headers,
            cookies=cookies,
            timeout=timeout,
            extensions=extensions,
        )


s = AsyncFriendli(async_client=CustomClient(httpx.AsyncClient()))

Debugging

You can set up your SDK to emit debug logs for SDK requests and responses.

You can pass your own logger class directly into your SDK.

from friendli import SyncFriendli
import logging

logging.basicConfig(level=logging.DEBUG)
s = SyncFriendli(debug_logger=logging.getLogger("friendli"))

You can also enable the default debug logger by setting the FRIENDLI_DEBUG environment variable to true.

IDE Support

PyCharm

Generally, the SDK will work well with most IDEs out of the box. However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.
