
OpenAI async API client with client-side timeout, retry with exponential backoff, and connection reuse

Project description

OpenAI client with client-side timeouts and parallel processing

Quick Install

pip install openai-async-client

🤔 What is this?

This library assists with OpenAI API usage by providing:

Client-side timeouts with retry and exponential backoff for completions.

Concurrent processing of pandas DataFrames.

Example of a chat completion with a client timeout of 1 second to connect and 10 seconds to read, with a maximum of 3 retries.

import os
from httpx import Timeout
from openai_async_client import OpenAIAsync, ChatRequest, Message

client = OpenAIAsync(api_key=os.environ['OPENAI_API_KEY'])

messages = [
    Message(
        role="user",
        content="Hello ChatGPT, Give a brief overview of the book Frankenstein by Mary Shelley.",
    )
]

response = client.chat_completion(
    request=ChatRequest(messages=messages),
    client_timeout=Timeout(1.0, read=10.0),
    retries=3,
)

Example of processing a DataFrame concurrently for chat completions, with up to 4 concurrent connections.

import os
from httpx import Timeout
from openai_async_client import OpenAIAsync, ChatRequest, Message, SystemMessage
import uuid
import pandas as pd

# Example DataFrame
TEST_INPUTS = [
    "the open society and its enemies by Karl Popper",
    "Das Capital by Karl Marx",
    "Pride and Prejudice by Jane Austen",
    "Frankenstein by Mary Shelley",
    "Moby Dick by Herman Melville",
]

records = [
    {"user_id": i, "book_id": str(uuid.uuid4())[:6], "book_name": s}
    for i, s in enumerate(TEST_INPUTS)
]
input_df = pd.DataFrame.from_records(records)


client = OpenAIAsync(api_key=os.environ['OPENAI_API_KEY'])


# Define a mapping function from a DataFrame row to a ChatRequest
def my_prompt_fn(r: pd.Series) -> ChatRequest:
    message = Message(
        role="user",
        content=f"Hello ChatGPT, Give a brief overview of the book {r.book_name}.",
    )
    # The key dict is mandatory since the order of results is NOT guaranteed!
    key = {"user_id": r.user_id, "book_id": r.book_id}
    return ChatRequest(
        key=key,
        messages=[message],
        system=SystemMessage(content="Assistant is providing book reviews"),
    )

# Process the DataFrame in parallel, making up to 4 concurrent requests to the OpenAI endpoint
result_df = client.chat_completions(df=input_df, request_fn=my_prompt_fn, max_connections=4)

The result_df contains 'openai_reply' and 'api_error' columns.
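Since individual rows can fail, one quick way to separate successes from failures is to filter on the 'api_error' column. A minimal sketch, assuming 'api_error' is empty/NaN for successful requests:

# Split results into successes and failures (assumes api_error is NaN/None on success)
ok_df = result_df[result_df["api_error"].isna()]
failed_df = result_df[result_df["api_error"].notna()]
print(f"{len(ok_df)} rows succeeded, {len(failed_df)} rows failed")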

Default Response Extraction

By default, only the "assistant" message (or messages, when n>1) is returned, but you can implement a custom ResponseProcessor:

from abc import ABC, abstractmethod
from typing import Any, Callable, Generic, TypeVar

R = TypeVar("R")

class ResponseProcessor(Generic[R], Callable[..., R], ABC):
    @abstractmethod
    def __call__(self, json: str, *args: Any, **kwargs: Any) -> R:
        pass
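
As an illustration, here is a minimal sketch of a custom processor that pulls the first choice's text out of the raw response body. The class name and the assumed JSON shape (the standard chat completion 'choices[0].message.content' layout) are illustrative assumptions, not part of this library's documented API:

import json as jsonlib

class FirstChoiceText(ResponseProcessor[str]):
    # Hypothetical processor: extract the text of the first assistant message
    # from the raw response body (passed in as a JSON string, per the ABC above).
    def __call__(self, json: str, *args, **kwargs) -> str:
        body = jsonlib.loads(json)
        return body["choices"][0]["message"]["content"]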

Disclaimer

This repository has no connection whatsoever to OpenAI.

