
A short wrapper for OpenAI API calls.


The Chinese documentation is available here.

OpenAI API Call


A Python wrapper for the OpenAI API, supporting multi-turn dialogue, proxies, and asynchronous data processing.

Installation

pip install openai-api-call --upgrade

Usage

Set API Key and Base URL

Method 1, write in Python code:

import openai_api_call
openai_api_call.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
openai_api_call.base_url = "https://api.example.com"

Method 2, set environment variables in ~/.bashrc or ~/.zshrc:

export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_BASE_URL="https://api.example.com"
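Packages commonly pick these variables up through os.environ. A minimal sketch of that pattern (the helper name and the fallback default below are illustrative, not the package's internals):

```python
import os

def load_config():
    """Read API configuration from the environment.

    Illustrative helper: returns the key and base URL, falling back to
    an empty key and the public endpoint when the variables are unset.
    """
    return {
        "api_key": os.environ.get("OPENAI_API_KEY", ""),
        "base_url": os.environ.get("OPENAI_BASE_URL", "https://api.openai.com"),
    }
```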

Examples

Example 1, simulate multi-turn dialogue:

from openai_api_call import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w") # the default mode is "a"

# print the chat history
chat.print_log()
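Under the hood, a chat history like the one above follows the standard OpenAI message format: a list of dicts with "role" and "content" keys. A minimal sketch of building such a log by hand, in plain Python and independent of the package:

```python
# Build a message log in the standard OpenAI chat format:
# a list of {"role": ..., "content": ...} dicts.
log = []

def user(content):
    log.append({"role": "user", "content": content})

def assistant(content):
    log.append({"role": "assistant", "content": content})

user("Hello, GPT-3.5!")
assistant("Hi! How can I help you?")  # a manually added response
user("What's your name?")

roles = [m["role"] for m in log]
# roles == ["user", "assistant", "user"]
```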

Example 2, process data in batch using a checkpoint file:

from openai_api_call import Chat, process_chats

# write a function that turns a message into a completed Chat
def msg2chat(msg):
    chat = Chat(api_key=api_key)
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the data
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)
# process the remaining data, reusing the cached results from the previous run
continue_chats = process_chats(msgs, msg2chat, checkpoint)

Example 3, process data in batch asynchronously: print hello in different languages, using two coroutines:

from openai_api_call import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = [
    [{"role":"user", "content":"print hello using %s" % lang}]
    for lang in langs]
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)
chats = load_chats("async_chat.jsonl")
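The ncoroutines=2 limit above is an example of bounded concurrency. With asyncio, that pattern can be sketched with a semaphore (a generic illustration, not the package's implementation; fake_completion stands in for a real API call):

```python
import asyncio

async def process_all(items, worker, ncoroutines=2):
    """Run `worker` over all items with at most `ncoroutines` in flight."""
    sem = asyncio.Semaphore(ncoroutines)

    async def bounded(item):
        async with sem:  # at most ncoroutines workers run concurrently
            return await worker(item)

    # gather preserves input order in its results
    return await asyncio.gather(*(bounded(i) for i in items))

async def fake_completion(msg):
    await asyncio.sleep(0.01)  # stand-in for a network round trip
    return f"response to {msg}"

results = asyncio.run(process_all(["a", "b", "c", "d"], fake_completion))
```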

License

This package is licensed under the MIT license. See the LICENSE file for more details.

Update log

The current version, 1.0.0, is stable: the redundant function-call feature has been removed, and an asynchronous data processing tool has been added.

Beta version

  • Since version 0.2.0, the Chat type is used to handle data.
  • Since version 0.3.0, you can use different API keys to send requests.
  • Since version 0.4.0, this package is maintained by cubenlp.
  • Since version 0.5.0, one can use process_chats to process data, with a customized msg2chat function and a checkpoint file.
  • Since version 0.6.0, the function-call feature was added.
