
A short wrapper for OpenAI API calls.

Project description

For the Chinese documentation, see here.

OpenAI API Call


A simple wrapper for the OpenAI API that sends a prompt message and returns the response.

Installation

pip install openai-api-call --upgrade

Note: Since version 0.2.0, data is handled by the Chat type, which is not compatible with previous versions.

Usage

Set API Key

import openai_api_call
openai_api_call.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

Alternatively, export OPENAI_API_KEY in ~/.bashrc so that it is set automatically whenever you start a terminal:

# Add the following code to ~/.bashrc
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
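
If the environment variable is set, you can check which key is currently active with the show_apikey helper used later in this README. The os.environ fallback below is only an illustration, not part of the package's API, and it assumes openai_api_call.api_key is None until you assign it:

import os
import openai_api_call
from openai_api_call import show_apikey

# Fall back to the OPENAI_API_KEY environment variable if no key was set explicitly
# (assumption: openai_api_call.api_key is None until you assign it)
if openai_api_call.api_key is None and "OPENAI_API_KEY" in os.environ:
    openai_api_call.api_key = os.environ["OPENAI_API_KEY"]

# Print the API key currently configured
show_apikey()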

Set Proxy (Optional)

from openai_api_call import proxy_on, proxy_off, proxy_status
# Check the current proxy
proxy_status()

# Set the proxy (example)
proxy_on(http="127.0.0.1:7890", https="127.0.0.1:7890")

# Check the updated proxy
proxy_status()

# Turn off proxy
proxy_off() 

Basic Usage

Example 1: send a prompt and get the response:

from openai_api_call import Chat, show_apikey, proxy_status

# Check if API key is set
show_apikey()

# Check if proxy is enabled
proxy_status()

# Send prompt and return response
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse(update=False) # do not update the chat history (update defaults to True)
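
As the examples below show, the reply text is available on the returned response object:

# print the reply text returned by the model
print(resp.content)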

Example 2: customize the message template, and get both the returned content and the number of tokens consumed:

import openai_api_call
from openai_api_call import Chat

# Customize the sending template
openai_api_call.default_prompt = lambda msg: [
    {"role": "system", "content": "Translate this text for me."},
    {"role": "user", "content": msg}
]
chat = Chat("Hello!")
# Retry without limit (max_requests=-1), with a 10-second timeout per request
response = chat.getresponse(temperature=0.5, max_requests=-1, timeout=10)
print("Number of consumed tokens: ", response.total_tokens)
print("Returned content: ", response.content)

Advanced Usage

Continue chatting based on the last response:

# first call
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse() # updates the chat history (update defaults to True)
print(resp.content)

# continue chatting
chat.user("How are you?")
next_resp = chat.getresponse()
print(next_resp.content)

# add a user message and a scripted assistant reply without calling the API
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# print chat history
chat.print_log()

License

This package is licensed under the MIT license. See the LICENSE file for more details.

Update log

  • Since version 0.2.0, the Chat type is used to handle data.
  • Since version 0.3.0, you can use different API keys to send requests (see the sketch after this list).
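
A minimal sketch of using different keys, relying only on the module-level api_key attribute shown earlier in this README (version 0.3.0 may also offer a per-chat setting; that interface is not shown here, and the keys below are placeholders):

import openai_api_call
from openai_api_call import Chat

# Use one key for the first request ...
openai_api_call.api_key = "sk-key-for-project-A"
chat_a = Chat("Hello from project A!")
resp_a = chat_a.getresponse()

# ... and a different key for the next request.
openai_api_call.api_key = "sk-key-for-project-B"
chat_b = Chat("Hello from project B!")
resp_b = chat_b.getresponse()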

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openai_api_call-0.3.0.tar.gz (7.8 kB, Source)

Built Distribution

openai_api_call-0.3.0-py3-none-any.whl (8.5 kB, Python 3)

File details

Details for the file openai_api_call-0.3.0.tar.gz.

File metadata

  • Download URL: openai_api_call-0.3.0.tar.gz
  • Upload date:
  • Size: 7.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.16

File hashes

Hashes for openai_api_call-0.3.0.tar.gz:

  • SHA256: 9a95af968539e106b27ea7771aef1de248a34dd2536dbc108b44cebe93e02749
  • MD5: 10875dc39b83041b2b8de2aac6bd9fc5
  • BLAKE2b-256: 7b1b9d265d68436980da0b120d766564ab09cdb64b9e6c7022ccedd89d89735b


File details

Details for the file openai_api_call-0.3.0-py3-none-any.whl.

File hashes

Hashes for openai_api_call-0.3.0-py3-none-any.whl:

  • SHA256: 3d9c88fdd4e55ff3c8c22ae81e9a91d69f1496c231f3987094903f72ca5853e0
  • MD5: ba96f1c1905576de3df279367aecc4cb
  • BLAKE2b-256: 1f56d36cf1d9936178e47ebe1082d5f5a12c9b1f22f782ebe448a030073db4f9

