A short wrapper for the ChatAPI Toolkit.
Project description
For the Chinese documentation, see here.
Chatapi Toolkit
A Python wrapper for the ChatAPI Toolkit, supporting multi-turn dialogue, proxies, and asynchronous batch processing.
Installation
```shell
pip install chatapi-toolkit --upgrade
```
Usage
Set API Key and Base URL
Method 1: set them in Python code:

```python
import chatapi_toolkit
chatapi_toolkit.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
chatapi_toolkit.base_url = "https://api.example.com"
```
Method 2: set environment variables in ~/.bashrc or ~/.zshrc:

```shell
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_BASE_URL="https://api.example.com"
```
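The wrapper is expected to pick these variables up automatically. As an illustration of the usual precedence (a code-level setting overrides the environment), here is a hypothetical helper; `resolve_api_key` is not part of the package:

```python
import os

def resolve_api_key(explicit_key=None):
    """Prefer an explicitly set key; otherwise fall back to OPENAI_API_KEY."""
    return explicit_key or os.environ.get("OPENAI_API_KEY")
```

With this precedence, a key assigned in code always wins, and the environment variable only serves as a default.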
Examples
Example 1: simulate a multi-turn dialogue:

```python
from chatapi_toolkit import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()
# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()
# add a response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")
# save the chat history
chat.save("chat.json", mode="w")  # mode defaults to "a"
# print the chat history
chat.print_log()
```
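The on-disk layout used by `chat.save` is internal to the package; purely as an illustration, a chat history can be pictured as a list of role/content messages, the shape chat APIs commonly use, which round-trips cleanly through JSON:

```python
import json

# Hypothetical chat log in the common role/content shape — an assumption
# for illustration, not the package's documented on-disk format.
log = [
    {"role": "user", "content": "Hello, GPT-3.5!"},
    {"role": "assistant", "content": "Hi! How can I help?"},
    {"role": "user", "content": "How are you?"},
]

# Serializing and restoring such a log is lossless.
serialized = json.dumps(log, ensure_ascii=False)
restored = json.loads(serialized)
```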
Example 2: process data in batch, using a checkpoint file:

```python
from chatapi_toolkit import Chat, process_chats

# write a function that turns a message into a finished Chat
def msg2chat(msg):
    chat = Chat(api_key=api_key)
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the first five messages
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)
# process the remaining data, reading the cached results from last time
continue_chats = process_chats(msgs, msg2chat, checkpoint)
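`process_chats` is provided by the package; purely as a sketch of the checkpoint idea (one JSON line per finished item, resuming by line count), a hypothetical stdlib-only version might look like this. The function name and line format are assumptions, not the package's implementation:

```python
import json
import os

def process_with_checkpoint(items, func, checkpoint, clearfile=False):
    """Apply func to each item, appending one JSON line per result to the
    checkpoint file, and skip items already recorded there."""
    if clearfile and os.path.exists(checkpoint):
        os.remove(checkpoint)
    # count already-finished items by counting checkpoint lines
    done = 0
    if os.path.exists(checkpoint):
        with open(checkpoint) as f:
            done = sum(1 for _ in f)
    results = []
    with open(checkpoint, "a") as f:
        for i, item in enumerate(items):
            if i < done:
                continue  # already processed in an earlier run
            result = func(item)
            results.append(result)
            f.write(json.dumps({"input": item, "output": result}) + "\n")
    return results
```

A rerun with a longer item list picks up exactly where the checkpoint left off, which is the behavior the example above relies on.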
Example 3: process data in batch asynchronously, printing "hello" in different languages with two coroutines:

```python
from chatapi_toolkit import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = ["print hello using %s" % lang for lang in langs]
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)
chats = load_chats("async_chat.jsonl")
```
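The `ncoroutines` argument caps how many requests run concurrently. That pattern can be sketched with `asyncio` and a semaphore; `fake_completion` below is a stand-in for the real network call, not part of the package:

```python
import asyncio

async def run_batch(prompts, worker, ncoroutines=2):
    """Run worker(prompt) for every prompt, at most ncoroutines at a time."""
    sem = asyncio.Semaphore(ncoroutines)

    async def bounded(prompt):
        async with sem:
            return await worker(prompt)

    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(bounded(p) for p in prompts))

async def fake_completion(prompt):
    # stand-in for the real API request
    await asyncio.sleep(0)
    return f"echo: {prompt}"
```

The semaphore is the whole trick: each coroutine must acquire a slot before calling the worker, so at most `ncoroutines` requests are in flight at once.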
License
This package is licensed under the MIT license. See the LICENSE file for more details.
Update log
The current version, 1.0.0, is a stable release: the redundant function-call feature has been removed, and an asynchronous processing tool has been added.
Beta version
- Since version
0.2.0
,Chat
type is used to handle data - Since version
0.3.0
, you can use different API Key to send requests. - Since version
0.4.0
, this package is mantained by cubenlp. - Since version
0.5.0
, one can useprocess_chats
to process the data, with a customizedmsg2chat
function and a checkpoint file. - Since version
0.6.0
, the feature function call is added.