ChatAPI Toolkit

For the Chinese documentation, see here.


A toolkit for the Chat API, supporting multi-turn dialogue, proxies, and asynchronous data processing.

Installation

pip install chattool --upgrade

Usage

Set API Key and Base URL

Set environment variables in ~/.bashrc or ~/.zshrc:

export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_API_BASE_URL="https://api.example.com"
export OPENAI_API_BASE="https://api.example.com/v1"
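As a quick sanity check (not part of chattool itself), you can verify from Python that these variables are visible to the process; the helper name and the fallback URL below are illustrative, matching the placeholder exports above:

```python
import os

def check_openai_env(env=None):
    """Return (api_key, base_url) from the environment, failing fast on a missing key."""
    env = os.environ if env is None else env
    api_key = env.get("OPENAI_API_KEY", "")
    base_url = env.get("OPENAI_API_BASE_URL", "https://api.example.com")
    # OpenAI-style keys start with "sk-"; catch an unset or malformed key early.
    if not api_key.startswith("sk-"):
        raise RuntimeError("OPENAI_API_KEY is missing or does not look like an API key")
    return api_key, base_url
```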

Examples

Example 1: simulate a multi-turn dialogue:

from chattool import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w") # the default mode is "a"

# print the chat history
chat.print_log()

Example 2: process data in batch, using a checkpoint file:

from chattool import Chat, process_chats

# write a function that turns one message into a completed Chat
def msg2chat(msg):
    chat = Chat()
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the data
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)
# process the rest of the data, reading the cache from the previous run
continue_chats = process_chats(msgs, msg2chat, checkpoint)

Example 3: process data in batch asynchronously, printing hello in different languages with two coroutines:

from chattool import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = ["print hello using %s" % lang for lang in langs]
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)
chats = load_chats("async_chat.jsonl")
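The ncoroutines argument bounds how many requests are in flight at once. A generic asyncio sketch of that pattern — illustrative, not chattool's internals — with a stand-in handler instead of a real API call:

```python
import asyncio

async def run_batch(msgs, handle, ncoroutines=2):
    # A semaphore caps concurrent workers, which is what an
    # ncoroutines-style limit amounts to.
    sem = asyncio.Semaphore(ncoroutines)

    async def worker(msg):
        async with sem:
            return await handle(msg)

    # gather preserves input order in its results
    return await asyncio.gather(*(worker(m) for m in msgs))

async def fake_handle(msg):
    # Stand-in for a real API request.
    await asyncio.sleep(0)
    return f"done: {msg}"

results = asyncio.run(run_batch(["a", "b", "c"], fake_handle))
```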

License

This package is licensed under the MIT license. See the LICENSE file for more details.

Update log

Current version: 2.3.0. Function calls, asynchronous processing, and fine-tuning are supported.

Beta version

  • Since version 0.2.0, the Chat type is used to handle data.
  • Since version 0.3.0, you can use different API keys to send requests.
  • Since version 0.4.0, this package is maintained by cubenlp.
  • Since version 0.5.0, you can use process_chats to process data, with a customized msg2chat function and a checkpoint file.
  • Since version 0.6.0, the function call feature was added.
  • Since version 1.0.0, the function call feature was removed, and an asynchronous processing tool was added.
  • Since version 2.0.0, the package was renamed to chattool, and the asynchronous processing tool was improved.

