
OpenAI async API with client-side timeout, retry with exponential backoff, and connection reuse

Project description

OpenAI client with client-side timeouts and parallel processing

Quick Install

pip install openai-async-client

🤔 What is this?

This library assists with OpenAI API usage by providing:

Client-side timeouts with retry and exponential backoff for completions.

Concurrent processing with pandas DataFrames.

Example of a chat completion with a client timeout of 1 second to connect and 10 seconds to read, with a maximum of 3 retries.

import os
from httpx import Timeout
from openai_async_client import OpenAIAsync, ChatRequest, Message, SystemMessage

client = OpenAIAsync(api_key=os.environ['OPENAI_API_KEY'])

messages = [
    Message(
        role="user",
        content=f"Hello ChatGPT, Give a brief overview of the book Frankenstein by Mary Shelley.",
    )
]

response = client.chat_completion(
    request=ChatRequest(messages=messages),
    client_timeout=Timeout(1.0, read=10.0),
    retries=3,
)
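
The Timeout above follows plain httpx semantics: the first positional value is the default applied to the connect, write, and pool phases, and read=10.0 overrides only the read phase. A fully explicit, equivalent construction (standard httpx, nothing specific to this library) would be:

from httpx import Timeout

# Equivalent to Timeout(1.0, read=10.0): connecting, writing, and acquiring a
# connection from the pool time out after 1 second, while reading the response
# may take up to 10 seconds.
client_timeout = Timeout(connect=1.0, read=10.0, write=1.0, pool=1.0)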

Example of concurrently processing a DataFrame of chat completion requests with up to 4 concurrent connections.

import os
from httpx import Timeout
from openai_async_client import OpenAIAsync, ChatRequest, Message, SystemMessage
import uuid
import pandas as pd

# Example DataFrame
TEST_INPUTS = [
    "The Open Society and Its Enemies by Karl Popper",
    "Das Kapital by Karl Marx",
    "Pride and Prejudice by Jane Austen",
    "Frankenstein by Mary Shelley",
    "Moby Dick by Herman Melville",
]

records = [
    {"user_id": i, "book_id": str(uuid.uuid4())[:6], "book_name": s}
    for i, s in enumerate(TEST_INPUTS)
]
input_df = pd.DataFrame.from_records(records)


client = OpenAIAsync(api_key=os.environ['OPENAI_API_KEY'])

# Define a mapping function from a DataFrame row to a ChatRequest
def my_prompt_fn(r: pd.Series) -> ChatRequest:
    message = Message(
        role="user",
        content=f"Hello ChatGPT, Give a brief overview of the book {r.book_name}.",
    )
    # A key dict is mandatory since the order of results is NOT guaranteed!
    key = {"user_id": r.user_id, "book_id": r.book_id}
    return ChatRequest(
        key=key,
        messages=[message],
        system=SystemMessage(content="Assistant is providing book reviews"),
    )

# Process the DataFrame in parallel, making up to 4 concurrent requests to the OpenAI endpoint
result_df = client.chat_completions(df=input_df, request_fn=my_prompt_fn, max_connections=4)

# result_df contains 'openai_reply' and 'api_error' columns.
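
Because results are keyed rather than returned in input order, a natural next step is to filter or join on those columns. The sketch below assumes only what is stated above: an 'api_error' column that is empty (None/NaN) for successful rows, an 'openai_reply' column with the assistant text, and the key fields (user_id, book_id) echoed back; the exact shape of result_df may differ.

# Split the result into failed and successful requests (column semantics assumed).
failed = result_df[result_df["api_error"].notna()]
print(f"{len(failed)} of {len(result_df)} requests failed")

ok = result_df[result_df["api_error"].isna()]
print(ok[["user_id", "book_id", "openai_reply"]].head())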

Default Response Extraction

By default, only the "assistant" message (or messages, if n > 1) is returned, but you can implement a custom ResponseProcessor:

from abc import ABC, abstractmethod
from typing import Any, Callable, Generic, TypeVar

R = TypeVar("R")

class ResponseProcessor(Generic[R], Callable[..., R], ABC):
    @abstractmethod
    def __call__(self, json: str, *args: Any, **kwargs: Any) -> R:
        pass
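
As a sketch, a custom processor that parses the raw response body and keeps the token usage alongside the first assistant message could look like the class below. TextAndUsageProcessor is a hypothetical name, the payload layout assumed is the standard OpenAI chat-completion response, and how a processor is registered with OpenAIAsync is not covered by this description; the snippet only illustrates the __call__ contract above.

import json as jsonlib  # aliased because the __call__ parameter is named 'json'
from typing import Any, Dict

class TextAndUsageProcessor(ResponseProcessor[Dict[str, Any]]):
    def __call__(self, json: str, *args: Any, **kwargs: Any) -> Dict[str, Any]:
        payload = jsonlib.loads(json)  # raw response body, assumed to be a JSON string
        return {
            "text": payload["choices"][0]["message"]["content"],
            "usage": payload.get("usage", {}),
        }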

Disclaimer

This repository has no connection whatsoever to OpenAI.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openai-async-client-0.1.7.tar.gz (5.9 kB)

Uploaded Source

Built Distribution

openai_async_client-0.1.7-py3-none-any.whl (16.6 kB)

Uploaded Python 3

File details

Details for the file openai-async-client-0.1.7.tar.gz.

File metadata

  • Download URL: openai-async-client-0.1.7.tar.gz
  • Upload date:
  • Size: 5.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.16

File hashes

Hashes for openai-async-client-0.1.7.tar.gz
Algorithm Hash digest
SHA256 c02d7e1bc71d52e8fb99c19eb3e2ac49427deb37a8addfa9edeca684a64feb18
MD5 9beb08113826f0a655bcd255186c9b4e
BLAKE2b-256 d06feadda92d876544a4b716c07f76563a5d98e702a55013f6ff3c0201f32aef


File details

Details for the file openai_async_client-0.1.7-py3-none-any.whl.

File metadata

File hashes

Hashes for openai_async_client-0.1.7-py3-none-any.whl
Algorithm Hash digest
SHA256 06e4255b775eb6309c3532450b56621a8f15251abc1c34efced3fc5cb89683cc
MD5 c3b5c0ed00c59d4eb0e2bbcf5f64e399
BLAKE2b-256 139adec0a503e1c1f328f4df5c84a7cb3beca3d171dde67619b735312c450e7c

