OpenaiBatchAPI: A Python Library that supports OpenAI Batch API
Project description
Note: Currently supports only gpt-4o and gpt-4o-mini models.
Installation
Install the package from PyPI using pip:
$ pip install OpenaiBatchAPI
Usage Example
Basic Example
from openai_batch_api import OpenaiBatchAPI

# Initialize the Batch API client
batch_client = OpenaiBatchAPI(
    api_key="API_TOKEN",
    max_retry=3,
    timeout=5 * 60  # Timeout in seconds
)

messages = []

# Add messages with custom IDs (e.g., "calc_{i}")
for i in range(7):
    messages.append({
        "id": f"calc_{i}",
        "content": [
            {
                "role": "user",
                "content": f"Calculate: 1 + {i} = "
            }
        ]
    })

# Add messages with auto-generated IDs (index-based)
for i in range(7):
    messages.append({
        "role": "user",
        "content": f"Calculate: 1 + {i} = "
    })

# Execute batch completion
batchs = batch_client.batchs_completion(
    messages,
    max_completion_tokens=32,
    model="gpt-4o-mini",
    temperature=0,
    seed=42,
    batch_size=2
)

# Print batch outputs
for batch in batchs:
    print("BATCH ID:", batch.id)
    print("FILE ID:", batch.output_file.id)

    for content in batch.output_file.contents:
        print("Message ID:", content["id"])
        print("Response:", content["choices"][0]["message"]["content"])
    print()
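Beyond printing, the per-message responses can be gathered into a single dictionary keyed by message ID for easier post-processing. The helper below is a sketch, not part of the library: it only reuses the attributes shown in the example above (`batch.output_file.contents` and the `id`/`choices` fields of each entry).

```python
def collect_responses(batchs):
    """Map each custom message ID to its response text.

    `batchs` is the list returned by batchs_completion() in the
    example above; only attributes shown there are accessed.
    """
    results = {}
    for batch in batchs:
        for content in batch.output_file.contents:
            results[content["id"]] = content["choices"][0]["message"]["content"]
    return results
```

With the sample run above, `collect_responses(batchs)["calc_0"]` would return the text `"1 + 0 = 1"`.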
Output Sample
The script will output results similar to the following (the sample below shows the seven custom-ID messages):
Preparing: 100%|███████████████████████| 4/4 [00:10<00:00, 2.74s/it]
Sending: 100%|█████████████████████████| 4/4 [00:02<00:00, 1.60it/s]
Processing: 100%|██████████████████████| 4/4 [00:12<00:00, 3.10s/it]
BATCH ID: batch_672b237c62f88190b0d69cbf789eb754
FILE ID: file-IlJUOenuE8p7okKB3TX2ANKW
Message ID: calc_0
Response: 1 + 0 = 1
Message ID: calc_1
Response: 1 + 1 = 2.
BATCH ID: batch_672b237ce4fc8190813e7b1239f0336d
FILE ID: file-hTFNk0uBC37plDaWtgYtD0Dr
Message ID: calc_2
Response: 1 + 2 = 3.
Message ID: calc_3
Response: 1 + 3 = 4.
BATCH ID: batch_672b237d937c8190a2ec942aa9401732
FILE ID: file-AbzBSJGl2pPU1B6KPmMuIDBB
Message ID: calc_4
Response: 1 + 4 = 5.
Message ID: calc_5
Response: 1 + 5 = 6.
BATCH ID: batch_672b237e076c8190807819fde4cdd093
FILE ID: file-jRAePmaheNSNAWYcawdmSgYF
Message ID: calc_6
Response: 1 + 6 = 7.
Usage Statistics
The print_usage_table method outputs a table summarizing token usage and cost.
batch_client.print_usage_table(digits=8)
This outputs:
+------------+--------+-------------+
| Category | Tokens | Price |
+------------+--------+-------------+
| Prompt | 119 | $0.00000892 |
| Completion | 55 | $0.00001650 |
+------------+--------+-------------+
| Total | 174 | $0.00002543 |
+------------+--------+-------------+
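The totals in the table can be reproduced by hand. The sketch below assumes gpt-4o-mini Batch API rates of $0.075 per 1M prompt tokens and $0.30 per 1M completion tokens (half the standard price at the time of writing; check OpenAI's current pricing page before relying on these numbers).

```python
# Assumed gpt-4o-mini Batch API rates (USD per token); not taken from
# the library itself -- verify against OpenAI's pricing page.
PROMPT_RATE = 0.075 / 1_000_000
COMPLETION_RATE = 0.30 / 1_000_000

prompt_cost = 119 * PROMPT_RATE          # ~$0.00000892
completion_cost = 55 * COMPLETION_RATE   # ~$0.00001650
total = prompt_cost + completion_cost    # ~$0.00002543 at 8 digits

print(f"Prompt:     ${prompt_cost:.8f}")
print(f"Completion: ${completion_cost:.8f}")
print(f"Total:      ${total:.8f}")
```

The `digits=8` argument to `print_usage_table` corresponds to the 8-decimal rounding shown here.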
File details
Details for the file OpenaiBatchAPI-1.1.0-py2.py3-none-any.whl.
File metadata
- Download URL: OpenaiBatchAPI-1.1.0-py2.py3-none-any.whl
- Upload date:
- Size: 7.3 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.20
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4ce51017cdbf1a76aed004227e49e8bd662ea107470171ac38831c7e95578bac |
| MD5 | 7aaab3148ad45161adadbcb91bd3946d |
| BLAKE2b-256 | f115f7343343e076376866fe28245726b24b3f87eef91b537b0acbe5f23bd78d |