
# GPT Batcher

A simple tool for batch processing messages with OpenAI's GPT models. `GPTBatcher` handles multiple requests in parallel, with timeouts and retries for robust error handling.

## Installation

To get started with `GPTBatcher`, install the package from PyPI by running:

```bash
pip install gpt_batch
```

## Quick Start

To use `GPTBatcher`, instantiate it with your OpenAI API key and the name of the model you wish to use. Here's a quick guide:

### Handling Message Lists

This example demonstrates how to send a list of questions and receive answers:

```python
from gpt_batch.batcher import GPTBatcher

# Initialize the batcher
batcher = GPTBatcher(api_key='your_key_here', model_name='gpt-3.5-turbo-1106')

# Send a list of messages and receive answers
result = batcher.handle_message_list(['question_1', 'question_2', 'question_3', 'question_4'])
print(result)
# Expected output: ["answer_1", "answer_2", "answer_3", "answer_4"]
```
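As the expected output shows, answers come back in the same order as the input questions, so the two lists can be zipped together. The snippet below uses placeholder data in place of a live API call:

```python
questions = ['question_1', 'question_2', 'question_3', 'question_4']
# Placeholder standing in for: answers = batcher.handle_message_list(questions)
answers = ['answer_1', 'answer_2', 'answer_3', 'answer_4']

# Map each question to its answer
qa = dict(zip(questions, answers))
print(qa['question_3'])  # -> answer_3
```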

### Handling Embedding Lists

This example shows how to get embeddings for a list of strings:

```python
from gpt_batch.batcher import GPTBatcher

# Reinitialize the batcher for embeddings
batcher = GPTBatcher(api_key='your_key_here', model_name='text-embedding-3-small')

# Send a list of strings and get their embeddings
result = batcher.handle_embedding_list(['question_1', 'question_2', 'question_3', 'question_4'])
print(result)
# Expected output: ["embedding_1", "embedding_2", "embedding_3", "embedding_4"]
```
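Assuming each returned embedding is a plain list of floats (the usual shape for OpenAI-style embedding APIs), the results can be compared with cosine similarity. The helper below is illustrative only and not part of `gpt_batch`:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors of floats
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Placeholder vectors standing in for two entries of `result`
emb_1 = [0.1, 0.3, 0.5]
emb_2 = [0.2, 0.1, 0.4]
print(cosine_similarity(emb_1, emb_2))
```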

### Handling Message Lists with a Different API

This example demonstrates how to send a list of questions and receive answers through an OpenAI-compatible API by setting a custom base URL:

```python
from gpt_batch.batcher import GPTBatcher

# Initialize the batcher with a custom API base URL
batcher = GPTBatcher(api_key='sk-', model_name='deepseek-chat', api_base_url="https://api.deepseek.com/v1")

# Send a list of messages and receive answers
result = batcher.handle_message_list(['question_1', 'question_2', 'question_3', 'question_4'])
print(result)
# Expected output: ["answer_1", "answer_2", "answer_3", "answer_4"]
```

## Configuration

The `GPTBatcher` class accepts several parameters that adjust its performance and behavior:

  • `api_key` (str): Your OpenAI API key.
  • `model_name` (str): Identifier of the GPT model to use; defaults to `'gpt-3.5-turbo-1106'`.
  • `system_prompt` (str): System message used to prime the model; defaults to an empty string.
  • `temperature` (float): Controls the randomness (creativity) of responses; defaults to 1.
  • `num_workers` (int): Number of parallel workers handling requests; defaults to 64.
  • `timeout_duration` (int): Timeout for API responses, in seconds; defaults to 60.
  • `retry_attempts` (int): Number of times to retry a failed request; defaults to 2.
  • `miss_index` (list): Tracks the indices of requests that failed to process correctly; populated during a run.

For more detailed documentation on the parameters and methods, refer to the class docstring.
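The worker and retry parameters above follow a common batching pattern. As a rough illustration only (not the library's actual implementation), a pool of `num_workers` threads that retries each request up to `retry_attempts` times and records failures in a `miss_index` might look like this:

```python
from concurrent.futures import ThreadPoolExecutor

def batch_process(items, handler, num_workers=64, retry_attempts=2):
    """Run `handler` over `items` in parallel, recording indices that fail."""
    results = [None] * len(items)
    miss_index = []

    def process(i):
        # One initial attempt plus `retry_attempts` retries
        for attempt in range(retry_attempts + 1):
            try:
                results[i] = handler(items[i])
                return
            except Exception:
                continue
        miss_index.append(i)  # all attempts failed

    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        list(pool.map(process, range(len(items))))
    return results, miss_index

# Usage with a toy handler that always fails on one input
flaky = lambda x: x.upper() if x != 'bad' else 1 / 0
answers, missed = batch_process(['a', 'bad', 'c'], flaky, num_workers=4)
print(answers, missed)  # -> ['A', None, 'C'] [1]
```

Failed items stay `None` in the results and their positions land in `miss_index`, matching the error-tracking behavior described above.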

