Project description
ChatCompletion Utils
ChatCompletion Utils is a Python library of utility functions for interacting with OpenAI's Chat Completion API. It lets you generate chat-friendly prompts, count tokens, and auto-select which model to use based on context length. It also has a few other bells and whistles ;)
Installation
Ensure you have installed the required packages:
poetry install
Usage
First, set your OpenAI API key in the environment:
export OPENAI_API_KEY=<your_api_key>
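As an alternative to the shell export, you can set the key from Python before using the library. The placeholder value below is just that, a placeholder:

```python
import os

# Set the key for the current process if the shell hasn't already.
# "sk-your-key-here" is a placeholder -- substitute your real key.
os.environ.setdefault("OPENAI_API_KEY", "sk-your-key-here")

api_key = os.environ["OPENAI_API_KEY"]
```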
Then, you can use the functions provided in the library:
from chat_completion_utils import (
    llm,
    _code_prompt,
    num_tokens_from_messages,
    select_model,
    build_prompt
)
# Generate a 'ChatCompletion' response with the OpenAI API
book_summary = llm(
    "Provide a brief summary of the following book series.",  # 'system' instruction
    "Harry Potter series.",  # 'user' prompt
    0.5  # model temperature
)
print(book_summary)
# Use 'prompt partials' (e.g. `_code_prompt()`) to add pre-defined protective language to your prompts
web_app = llm(
    _code_prompt(),
    "generate a React web application framework"
)
# Calculate tokens in a list of messages
## - `llm()` bundles this functionality
messages = [
    {"role": "system", "content": "Translate the following English to French"},
    {"role": "user", "content": "Hello, how are you?"}
]
token_count = num_tokens_from_messages(messages, model="gpt-4")
print(token_count)
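Under the hood, counting follows the usual Chat Completion accounting: each message carries a small framing overhead on top of its encoded content, plus a primer for the assistant's reply. A minimal standalone sketch of that scheme (not the library's actual implementation; a naive whitespace split stands in for the real tiktoken encoder, and the overhead constants follow OpenAI's published values for gpt-4):

```python
# Hypothetical sketch of per-message token accounting. Real counting
# encodes each string with tiktoken; split() is only a stand-in.
def approx_tokens_from_messages(messages, tokens_per_message=3):
    total = 0
    for message in messages:
        total += tokens_per_message  # per-message framing overhead
        for value in message.values():
            total += len(value.split())  # stand-in for tiktoken encoding
    total += 3  # every reply is primed with <|start|>assistant<|message|>
    return total

messages = [
    {"role": "system", "content": "Translate the following English to French"},
    {"role": "user", "content": "Hello, how are you?"},
]
print(approx_tokens_from_messages(messages))  # → 21
```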
# Select the appropriate model to use based on token count
## - `llm()` bundles this functionality
## - automatically switches to 'gpt-4-32k' if you need it; otherwise it goes with the cheaper 'gpt-4' (or 'gpt-3.5-turbo' if you ask it to)
selected_model = select_model(messages)
print(selected_model)
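The selection logic can be pictured roughly as follows. This is a standalone sketch under assumed context limits, not the library's source; the exact thresholds and fallback behavior may differ:

```python
# Assumed context windows -- not copied from the library.
CONTEXT_LIMITS = {"gpt-4": 8192, "gpt-4-32k": 32768}

def pick_model(token_count, model_family="gpt-4"):
    # Stay on the base model while the conversation fits its window;
    # fall back to the 32k variant only when it doesn't.
    if model_family == "gpt-4" and token_count > CONTEXT_LIMITS["gpt-4"]:
        return "gpt-4-32k"
    return model_family

print(pick_model(500))     # → gpt-4
print(pick_model(12_000))  # → gpt-4-32k
```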
# Construct prompt objects
## - `llm()` bundles this functionality
## - You shouldn't need to use this, but... maybe I'm wrong. Go wild!
prompt = build_prompt(
    system_content=_code_prompt("Generate Ruby code for the given user prompt"),
    user_content="function to compute a factorial."
)
print(prompt)
Functions
llm()
Generate a chat-based completion using the OpenAI API.
Arguments
- system_instruction (str): The system instruction.
- user_input (str): The user input.
- temp (float): The temperature for controlling randomness of the output.
Returns
(str): The response from the model.
_code_prompt()
Generate a code-only prompt.
Arguments
- prompt (str): The base prompt to modify.
Returns
(str): The modified prompt for code-only output.
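The idea of a prompt partial is simply to append pre-defined protective language to a base prompt. A hypothetical sketch (the library's actual wording is not shown here, and `code_only` is an illustrative name, not the library's API):

```python
# Hypothetical prompt partial: wrap a base prompt in code-only language.
def code_only(prompt="Generate code for the given user prompt"):
    return (
        f"{prompt} Respond with code only: no prose, no explanations, "
        "and no markdown fences."
    )

print(code_only("Generate Ruby code for the given user prompt"))
```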
num_tokens_from_messages()
Count the number of tokens used by a list of messages.
Arguments
- messages (list): A list of messages.
- model (str): The model name.
Returns
(int): The number of tokens used by the list of messages.
select_model()
Select the appropriate model based on token count.
Arguments
- messages (list): A list of messages.
- model_family (str): The model family (default: 'gpt-4').
- force (bool): Force the use of the specified model family if the token count is within limits.
Returns
(str): The selected model name.
build_prompt()
Build a list of messages to use as input for the OpenAI API.
Arguments
- system_content (str): The content for the system message.
- user_content (str): The content for the user message.
- messages (list): An optional list of existing messages.
Returns
(list): A list of messages to be used as input for the OpenAI API.
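The returned structure is the standard Chat Completion message list. A standalone sketch of the likely behavior, assuming the function simply appends the two role-tagged dicts to any existing messages (an assumption, not the library's source):

```python
# Hypothetical sketch of build_prompt()'s output shape.
def sketch_build_prompt(system_content, user_content, messages=None):
    messages = list(messages or [])  # don't mutate the caller's list
    messages.append({"role": "system", "content": system_content})
    messages.append({"role": "user", "content": user_content})
    return messages

prompt = sketch_build_prompt(
    system_content="Generate Ruby code for the given user prompt",
    user_content="function to compute a factorial.",
)
print(prompt)
```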
Constants
The MODELS constant is a dictionary containing information about the supported models and their properties, such as the maximum number of tokens allowed.
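A plausible shape for that dictionary, for illustration only (the exact keys and limits below are assumptions, not copied from the library):

```python
# Assumed structure of the MODELS constant.
MODELS = {
    "gpt-3.5-turbo": {"max_tokens": 4096},
    "gpt-4": {"max_tokens": 8192},
    "gpt-4-32k": {"max_tokens": 32768},
}

print(MODELS["gpt-4"]["max_tokens"])  # → 8192
```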
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file chat_completion_utils-1.2.5.tar.gz
File metadata
- Download URL: chat_completion_utils-1.2.5.tar.gz
- Upload date:
- Size: 3.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.10.5 Darwin/21.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9326b3582f72e259fe7dfbc2b7288a83123e70f630563a065363bf2cda7a3720
MD5 | a357be4505f991fc3eedbf0bc390b947
BLAKE2b-256 | 7d87f2922ab37a3706b61f281d77b7738931cf4c190b50130d0a3196c573f1a6
File details
Details for the file chat_completion_utils-1.2.5-py3-none-any.whl
File metadata
- Download URL: chat_completion_utils-1.2.5-py3-none-any.whl
- Upload date:
- Size: 4.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.10.5 Darwin/21.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | ca89967e3dbe0dfd48bfb523c659d4242dc4b3721fd86388460942b62f874ef0
MD5 | bd89a90d48833a5a700d5f914d44a6b4
BLAKE2b-256 | 238c0ca3bd3b8809ffa24bcbc422ce93371aeaf59575d29d73248c722481c6f2