
Toolkit for Chat API

Project description

For the Chinese documentation, see here.

ChatAPI Toolkit


A toolkit for the Chat API, supporting multi-turn dialogue, proxies, and asynchronous data processing.

Installation

pip install chattool --upgrade
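
To confirm which version was installed, a quick check using only the Python standard library works (assuming the distribution name is chattool, as used above):

from importlib.metadata import version

# print the installed chattool version, e.g. "2.3.2"
print(version("chattool"))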

Usage

Set API Key and Base URL

Set environment variables in ~/.bashrc or ~/.zshrc:

export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_API_BASE_URL="https://api.example.com"
export OPENAI_API_BASE="https://api.example.com/v1"
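
If you would rather not edit shell configuration files, the same variables can be set from Python; this is a minimal sketch that assumes chattool reads these environment variables (set them before the first import to be safe):

import os

# placeholder values copied from above; replace with your own key and endpoint
os.environ["OPENAI_API_KEY"] = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
os.environ["OPENAI_API_BASE_URL"] = "https://api.example.com"
os.environ["OPENAI_API_BASE"] = "https://api.example.com/v1"

from chattool import Chat  # import after the variables are set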

Examples

Example 1: simulate a multi-turn dialogue:

from chattool import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w")  # the default mode is "a" (append)

# print the chat history
chat.print_log()
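
The calls above are enough for a small interactive loop; the sketch below only reuses the Chat, user, getresponse, and print_log calls from Example 1, so treat it as an illustration rather than library code:

from chattool import Chat

def interactive_chat():
    """Minimal REPL: send each input line to the model and show the reply."""
    chat = Chat("Hello, GPT-3.5!")
    chat.getresponse()
    while True:
        text = input("you> ").strip()
        if text in ("quit", "exit"):
            break
        chat.user(text)            # append the user turn
        resp = chat.getresponse()  # request the assistant turn
        print(resp)                # printable form of the response depends on the library
    chat.print_log()               # dump the whole conversation at the end

interactive_chat()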

Example 2: process data in batch, using a checkpoint file (checkpoint in the code below):

# write a function that processes one message and returns the resulting Chat
from chattool import Chat, process_chats

def msg2chat(msg):
    chat = Chat()
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the data
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)
# process the remaining data, reusing the cached results from the previous run
continue_chats = process_chats(msgs, msg2chat, checkpoint)
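
Because the checkpoint is a .jsonl file with one finished chat per line (an assumption based on the file name and on load_chats in Example 3), progress between runs can be estimated with plain file I/O:

import os

def checkpoint_progress(path, total):
    """Count non-empty lines in the checkpoint as finished items (assumed format)."""
    if not os.path.exists(path):
        return 0, total
    with open(path, "r", encoding="utf-8") as f:
        done = sum(1 for line in f if line.strip())
    return done, total

done, total = checkpoint_progress("chat.jsonl", len(msgs))
print(f"{done}/{total} chats cached")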

Example 3: process data in batch asynchronously, printing "hello" in different languages with two coroutines:

from chattool import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = ["print hello using %s" % lang for lang in langs]
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)
chats = load_chats("async_chat.jsonl")
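
Once the batch finishes, the loaded chats can be inspected with the same print_log call used in Example 1; the None check is an assumption about how failed requests might appear in the list:

for chat in chats:
    if chat is not None:  # skip entries that did not complete (assumed failure marker)
        chat.print_log()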

License

This package is licensed under the MIT license. See the LICENSE file for more details.

Update log

Current version: 2.3.2. Function calls, asynchronous processing, and fine-tuning are supported.

Beta version

  • Since version 0.2.0, the Chat type is used to handle data.
  • Since version 0.3.0, you can use different API keys to send requests.
  • Since version 0.4.0, this package is maintained by cubenlp.
  • Since version 0.5.0, you can use process_chats to process data in batch, with a customized msg2chat function and a checkpoint file.
  • Since version 0.6.0, the function call feature was added.
  • Since version 1.0.0, the function call feature was removed, and the asynchronous processing tool was added.
  • Since version 2.0.0, the package was renamed to chattool, and the asynchronous processing tool was improved.

Download files

Download the file for your platform.

Source Distribution

chattool-2.3.2.tar.gz (19.2 kB, source)

Built Distribution


chattool-2.3.2-py3-none-any.whl (21.3 kB, Python 3)

File details

Details for the file chattool-2.3.2.tar.gz.

File metadata

  • Download URL: chattool-2.3.2.tar.gz
  • Upload date:
  • Size: 19.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for chattool-2.3.2.tar.gz

  Algorithm    Hash digest
  SHA256       371c986dd47eae18559d1e40e2288ded7c5d6cd0253658d4c270d5a4b7707c1e
  MD5          c25e14a4b313e955c490b77131bed576
  BLAKE2b-256  58a8dc1875c68f2f7e3646cf0c0e2a7963a219d5053413619a9d76ad9555b826


File details

Details for the file chattool-2.3.2-py3-none-any.whl.

File metadata

  • Download URL: chattool-2.3.2-py3-none-any.whl
  • Upload date:
  • Size: 21.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for chattool-2.3.2-py3-none-any.whl

  Algorithm    Hash digest
  SHA256       70c241bc0fadeb4190ef2c4a8e7642cf8928923b829a8a18c059ccadeeeb9a99
  MD5          f6f6e530340270f065723044d0fa3f4b
  BLAKE2b-256  b03b16bf8c356237d240ce7bfbbe1075e08965414a58bf1b83db383de15b16ce

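To check a download against the digests above, a standard-library sketch like the following works; the file names and SHA256 values are copied from the tables on this page, and the path is assumed to point at the downloaded file:

import hashlib

EXPECTED_SHA256 = {
    "chattool-2.3.2.tar.gz": "371c986dd47eae18559d1e40e2288ded7c5d6cd0253658d4c270d5a4b7707c1e",
    "chattool-2.3.2-py3-none-any.whl": "70c241bc0fadeb4190ef2c4a8e7642cf8928923b829a8a18c059ccadeeeb9a99",
}

def verify(filename):
    """Compare a local file's SHA256 with the digest published on this page."""
    with open(filename, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest == EXPECTED_SHA256[filename]

print(verify("chattool-2.3.2.tar.gz"))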
