Small ChatGPT library to support chat templates and function calls
AutoChat
AutoChat is a lightweight interface to the ChatGPT API that simplifies the process of creating conversational agents.
- ChatGPT Class: Conversation wrapper that stores the instruction, context, and message history.
- Message Class: Message wrapper to handle format/parsing automatically.
- Function Calls: Capability to handle function calls within the conversation, allowing complex interactions and responses.
- Template System: A straightforward text-based template system for defining the behavior of the chatbot, making it easy to customize its responses and actions.
Installation
To install the package, you can use pip:
pip install autochat
Please note that this package requires Python 3.6 or later.
Simple Example
> from autochat import ChatGPT
> chat = ChatGPT(instruction="You are a parrot")
> chat.ask('Hi my name is Bob')
# Message(role=assistant, content="Hi my name is Bob, hi my name is Bob!")
> chat.ask('Can you tell me my name?')
# Message(role=assistant, content="Your name is Bob, your name is Bob!")
Template System
We provide a simple template system for defining the behavior of the chatbot, using markdown-like syntax.
## system
You are a parrot
## user
Hi my name is Bob
## assistant
Hi my name is Bob, hi my name is Bob!
## user
Can you tell me my name?
## assistant
Your name is Bob, your name is Bob!
You can then load the template file using the from_template method:
parrotGPT = ChatGPT.from_template("./parrot_template.txt")
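The loaded instance behaves like any other ChatGPT object, so you can continue the conversation with ask. A minimal sketch, assuming the template above is saved as parrot_template.txt (the response shown is illustrative, not guaranteed output):
> parrotGPT.ask('Hi my name is Alice')
# Message(role=assistant, content="Hi my name is Alice, hi my name is Alice!")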
The template system also supports function calls. Check out examples/classify.py for a complete example.
Function Calls Handling
The library supports function calls, handling the back-and-forth between the system and the assistant.
from autochat import ChatGPT, Message
import json

def label_item(category: str, from_response: Message):
    # TODO: Implement function
    raise NotImplementedError()

with open("./examples/function_label.json") as f:
    FUNCTION_LABEL_ITEM = json.load(f)

classifierGPT = ChatGPT.from_template("./examples/classify_template.txt")
classifierGPT.add_function(label_item, FUNCTION_LABEL_ITEM)

text = "The new iPhone is out"
for message in classifierGPT.run_conversation(text):
    print(message.to_markdown())
# > ## assistant
# > It's about \"Technology\" since it's about a new iPhone.
# > LABEL_ITEM(category="Technology")
# > ## function
# > NotImplementedError()
# > ## assistant
# > Seems like you didn't implement the function yet.
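The FUNCTION_LABEL_ITEM dict loaded from examples/function_label.json describes the function the assistant is allowed to call. The actual contents of that file are not reproduced here; a hypothetical definition following OpenAI's function-calling schema might look like this:

FUNCTION_LABEL_ITEM = {
    "name": "label_item",
    "description": "Assign a category label to the given text",
    "parameters": {
        "type": "object",
        "properties": {
            "category": {
                "type": "string",
                "description": "The category to assign, e.g. Technology",
            }
        },
        "required": ["category"],
    },
}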
Environment Variables
The OPENAI_MODEL environment variable specifies the OpenAI model to use. If not set, it defaults to "gpt-4".
export OPENAI_MODEL="gpt-4"
export OPENAI_API_KEY=<your-key>
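If you prefer to configure these from Python instead of the shell, you can set the same variables via os.environ before using the library. This is a sketch under the assumption that autochat reads the variables at runtime, as the defaults above suggest:

import os

os.environ["OPENAI_MODEL"] = "gpt-4"
os.environ["OPENAI_API_KEY"] = "<your-key>"  # replace with your actual key

from autochat import ChatGPT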
Support
If you encounter any issues or have questions, please file an issue on the GitHub project page.
License
This project is licensed under the terms of the MIT license.