A simple wrapper for the OpenAI API.
Project description
For documentation in Chinese, see here.
OpenAI API Call
A Python wrapper for the OpenAI API, supporting multi-turn dialogue, proxies, and asynchronous data processing.
Installation
```bash
pip install openai-api-call --upgrade
```
Usage
Set API Key and Base URL
Method 1: set them in your Python code:
```python
import openai_api_call

openai_api_call.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
openai_api_call.base_url = "https://api.example.com"
```
Method 2: set environment variables in ~/.bashrc or ~/.zshrc:
```bash
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_BASE_URL="https://api.example.com"
```
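The two methods can also be combined: read the exported variables with the standard library and assign them to the module attributes shown in Method 1. This is only an illustrative sketch; the package may already pick these variables up on import.

```python
import os
import openai_api_call

# Read the values exported in ~/.bashrc or ~/.zshrc and assign them explicitly
openai_api_call.api_key = os.getenv("OPENAI_API_KEY")
openai_api_call.base_url = os.getenv("OPENAI_BASE_URL")
```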
Examples
Example 1: simulate a multi-turn dialogue:
```python
from openai_api_call import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add a response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w")  # the default mode is "a"

# print the chat history
chat.print_log()
```
Example 2: process data in batches, using a checkpoint file:
```python
from openai_api_call import Chat, process_chats

# write a function that turns one input message into a finished Chat
def msg2chat(msg):
    chat = Chat()  # uses the API key configured above
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]

# process the first five items
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)

# process the remaining data, reusing the cached results from the previous run
continue_chats = process_chats(msgs, msg2chat, checkpoint)
```
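The checkpoint file lets a repeated or interrupted run skip items that are already finished. As a rough illustration of that pattern (not the actual process_chats implementation; the process_with_checkpoint and item2record names here are hypothetical):

```python
import json

def process_with_checkpoint(items, item2record, checkpoint):
    """Sketch of checkpointed batch processing: one JSON line per finished item."""
    # load records that were completed in a previous run
    records = []
    try:
        with open(checkpoint, "r", encoding="utf-8") as f:
            records = [json.loads(line) for line in f]
    except FileNotFoundError:
        pass
    # process only the remaining items, appending each result to the checkpoint
    with open(checkpoint, "a", encoding="utf-8") as f:
        for item in items[len(records):]:
            record = item2record(item)
            f.write(json.dumps(record) + "\n")
            records.append(record)
    return records
```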
Example 3: process data in batches asynchronously, printing "hello" in different languages using two coroutines:
```python
from openai_api_call import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = ["print hello using %s" % lang for lang in langs]

# send the requests concurrently, writing results to a checkpoint file
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)

# load the finished chats from the checkpoint file
chats = load_chats("async_chat.jsonl")
```
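Here ncoroutines caps how many requests are in flight at once. The underlying idea can be sketched with plain asyncio and a semaphore; the fake_request coroutine below is a placeholder, not the package's internals.

```python
import asyncio

async def fake_request(prompt):
    """Placeholder for an API call; sleeps instead of hitting the network."""
    await asyncio.sleep(0.1)
    return "response to: " + prompt

async def run_all(prompts, ncoroutines=2):
    sem = asyncio.Semaphore(ncoroutines)  # at most `ncoroutines` tasks run at once

    async def worker(prompt):
        async with sem:
            return await fake_request(prompt)

    return await asyncio.gather(*(worker(p) for p in prompts))

results = asyncio.run(run_all(["print hello using %s" % lang
                               for lang in ["python", "java", "Julia", "C++"]]))
```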
License
This package is licensed under the MIT license. See the LICENSE file for more details.
Update log
The current version, 1.0.0, is a stable release: the redundant function call feature has been removed, and an asynchronous processing tool has been added.
Beta version
- Since version 0.2.0, the Chat type is used to handle data.
- Since version 0.3.0, different API keys can be used to send requests.
- Since version 0.4.0, this package is maintained by cubenlp.
- Since version 0.5.0, process_chats can be used to process data, with a customized msg2chat function and a checkpoint file.
- Since version 0.6.0, the function call feature is added.
Download files
Source Distribution: openai_api_call-1.4.0.tar.gz
Built Distribution: openai_api_call-1.4.0-py3-none-any.whl
File details
Details for the file openai_api_call-1.4.0.tar.gz.
File metadata
- Download URL: openai_api_call-1.4.0.tar.gz
- Upload date:
- Size: 14.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.8.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8ac0997552d71b3715ba65498d3a5804620aae2d5d73a212bb536b84fc1e828a |
| MD5 | dc7402ecfc7b57c830162dc308beddda |
| BLAKE2b-256 | 7f703300960de6ebaabc05ad2fd7203651a794e6096311bc9d7d5e10be3b5b44 |
File details
Details for the file openai_api_call-1.4.0-py3-none-any.whl.
File metadata
- Download URL: openai_api_call-1.4.0-py3-none-any.whl
- Upload date:
- Size: 16.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.8.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ed7a08109a82a4851d7084a79e76df0f1603e8e3c777b3ad695225a2099bdd84 |
| MD5 | 899dcdb403b247aebeb56c9776f088c1 |
| BLAKE2b-256 | 9609807d8bfc0b742275b0b67a91d177c6a747bf647bb58d34b8f71a9f839ce1 |