gpt-to-chatgpt-py
Convert a regular GPT call into a ChatGPT call
TypeScript version: https://github.com/bramses/gpt-to-chatgpt-ts
Functions
toChatML()
Converts a string message into a list of Chat Markup Language (ChatML) message dictionaries.
Usage
toChatML(message: str, options: Optional[dict] = None) -> List[dict]
Arguments
- message: The string message to be converted to ChatML.
- options (optional): A dictionary that can contain the following keys:
  - system_messages: A list of strings representing system messages to prepend to the ChatML output.
  - role: The role of the message (either Role.USER or Role.ASSISTANT).
Examples
toChatML('hello')
# Output: [{'role': Role.USER, 'content': 'hello'}]
toChatML('hello', {'system_messages': ['hi'], 'role': Role.ASSISTANT})
# Output: [{'role': Role.SYSTEM, 'content': 'hi'}, {'role': Role.ASSISTANT, 'content': 'hello'}]
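The behavior shown above can be sketched in plain Python. This is a minimal illustrative reimplementation, not the package's actual source; the Role enum and its string values are assumed to mirror the standard OpenAI role names:

```python
from enum import Enum
from typing import List, Optional

class Role(str, Enum):
    SYSTEM = "system"
    USER = "user"
    ASSISTANT = "assistant"

def to_chatml(message: str, options: Optional[dict] = None) -> List[dict]:
    """Build a ChatML-style message list from a plain string."""
    options = options or {}
    # Wrap each system message as a SYSTEM entry first.
    chatml = [
        {"role": Role.SYSTEM, "content": text}
        for text in options.get("system_messages", [])
    ]
    # Append the main message, defaulting to the USER role.
    chatml.append({"role": options.get("role", Role.USER), "content": message})
    return chatml
```

The key design point is that system messages always precede the user/assistant message, matching the order the ChatGPT API expects.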
get_message()
Extracts the message content from a response object.
Usage
get_message(response: dict, options: Optional[dict] = None) -> Union[str, dict]
Arguments
- response: The response object from which to extract the message content.
- options (optional): A dictionary that can contain the following keys:
  - usage: A boolean value that indicates whether to return the usage information of the response.
  - role: A boolean value that indicates whether to return the role of the message.
  - isMessages: A boolean value that indicates whether to return the message content as a list of messages.
Examples
get_message(test_response)
# Output: 'The 2020 World Series was played in Arlington, Texas at the Globe Life Field, which was the new home stadium for the Texas Rangers.'
get_message(test_response, {'usage': True, 'role': True, 'isMessages': True})
# Output: {'usage': {'prompt_tokens': 56, 'completion_tokens': 31, 'total_tokens': 87},
# 'roles': ['assistant'],
# 'messages': ['The 2020 World Series was played in Arlington, Texas at the Globe Life Field, which was the new home stadium for the Texas Rangers.']}
get_message(test_response, {'usage': True, 'role': True})
# Output: {'usage': {'prompt_tokens': 56, 'completion_tokens': 31, 'total_tokens': 87},
# 'role': 'assistant',
# 'message': 'The 2020 World Series was played in Arlington, Texas at the Globe Life Field, which was the new home stadium for the Texas Rangers.'}
get_message(test_response, {'usage': True, 'isMessages': True})
# Output: {'usage': {'prompt_tokens': 56, 'completion_tokens': 31, 'total_tokens': 87},
# 'messages': ['The 2020 World Series was played in Arlington, Texas at the Globe Life Field, which was the new home stadium for the Texas Rangers.']}
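Assuming an OpenAI-style chat completion response (a dict with `choices` and `usage` keys), the extraction logic illustrated by the examples above can be sketched as follows. This is an illustrative reimplementation, not the package's source:

```python
from typing import Optional, Union

def get_message(response: dict, options: Optional[dict] = None) -> Union[str, dict]:
    """Extract message content (and optionally usage/role info) from an
    OpenAI-style chat completion response dict."""
    options = options or {}
    contents = [c["message"]["content"] for c in response["choices"]]
    roles = [c["message"]["role"] for c in response["choices"]]

    # With no options set, return just the first message's content string.
    if not any(options.get(k) for k in ("usage", "role", "isMessages")):
        return contents[0]

    out = {}
    if options.get("usage"):
        out["usage"] = response["usage"]
    if options.get("isMessages"):
        # Plural keys: lists covering every choice in the response.
        if options.get("role"):
            out["roles"] = roles
        out["messages"] = contents
    else:
        # Singular keys: just the first choice.
        if options.get("role"):
            out["role"] = roles[0]
        out["message"] = contents[0]
    return out
```

Note how isMessages switches the output from singular keys ('role', 'message') to plural list-valued keys ('roles', 'messages'), matching the examples above.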
About the Developer
This repository was written by Bram Adams, a writer and programmer based out of NYC.
Bram publishes a Zettelkasten with a twice-weekly newsletter (which you can subscribe to here), is a community developer ambassador for OpenAI, and takes on freelance contracts (for hire!) related to AI, web dev, and AR/VR.
Bram is also the creator of Stenography, an API and VS Code extension that automatically documents code on save.
You can learn more about him and his work on his website.