
A small ChatGPT library supporting chat templates and function calls



AutoChat is a lightweight interface to the ChatGPT API that simplifies the process of creating conversational agents.

  • ChatGPT Class: Conversation wrapper that stores the instruction, context, and message history.
  • Message Class: Message wrapper that handles formatting and parsing automatically.
  • Function Calls: Handles function calls within the conversation, enabling complex interactions and responses.
  • Template System: A straightforward text-based template system for defining the chatbot's behavior, making it easy to customize its responses and actions.


To install the package, you can use pip:

pip install autochat

Please note that this package requires Python 3.6 or later.

Simple Example

> from autochat import ChatGPT
> chat = ChatGPT(instruction="You are a parrot")
> chat.ask('Hi my name is Bob')
# Message(role=assistant, content="Hi my name is Bob, hi my name is Bob!")
> chat.ask('Can you tell me my name?')
# Message(role=assistant, content="Your name is Bob, your name is Bob!")

Template System

We provide a simple template system for defining the behavior of the chatbot, using a markdown-like syntax.

## system
You are a parrot

## user
Hi my name is Bob

## assistant
Hi my name is Bob, hi my name is Bob!

## user
Can you tell me my name?

## assistant
Your name is Bob, your name is Bob!

You can then load the template file using the from_template method:

parrotGPT = ChatGPT.from_template("./parrot_template.txt")

The template system also supports function calls. Check out the examples/ directory for a complete example.

Function Calls Handling

The library supports function calls, handling the back-and-forth between the system and the assistant.

from autochat import ChatGPT, Message
import json

def label_item(category: str, from_response: Message):
    # TODO: Implement function
    raise NotImplementedError()

with open("./examples/function_label.json") as f:
    FUNCTION_LABEL_ITEM = json.load(f)

classifierGPT = ChatGPT.from_template("./examples/classify_template.txt")
classifierGPT.add_function(label_item, FUNCTION_LABEL_ITEM)

text = "The new iPhone is out"
for message in classifierGPT.run_conversation(text):
    print(message)  # display each step of the exchange

# > ## assistant
# > It's about "Technology" since it's about a new iPhone.
# > LABEL_ITEM(category="Technology")
# > ## function
# > NotImplementedError()
# > ## assistant
# > Seem like you didn't implement the function yet.
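The function specification passed to add_function is loaded from JSON. The actual contents of examples/function_label.json aren't shown here, but a spec following the OpenAI function-calling schema would look roughly like this (the description and parameter details below are assumptions for illustration only):

```python
# Hypothetical contents of function_label.json, following the OpenAI
# function-calling schema; the real file in examples/ may differ.
FUNCTION_LABEL_ITEM = {
    "name": "label_item",
    "description": "Assign a category label to the given text",
    "parameters": {
        "type": "object",
        "properties": {
            "category": {
                "type": "string",
                "description": "Category label, e.g. 'Technology'",
            },
        },
        "required": ["category"],
    },
}
```

The `name` tells the model which Python callable to invoke, and `parameters` is a JSON Schema describing the keyword arguments it will be called with.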

Environment Variables

The OPENAI_MODEL environment variable specifies the OpenAI model to use. If not set, it defaults to "gpt-4".

export OPENAI_MODEL="gpt-4"
export OPENAI_API_KEY=<your-key>
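A minimal sketch of how these settings resolve on the client side, assuming the library reads them from the environment with "gpt-4" as the fallback model (the variable handling here is illustrative, not the library's actual internals):

```python
import os

# Resolve the model name, falling back to "gpt-4" when OPENAI_MODEL is unset,
# mirroring the default described above.
model = os.environ.get("OPENAI_MODEL", "gpt-4")

# The API key has no documented default; check for it explicitly.
api_key = os.environ.get("OPENAI_API_KEY")
if api_key is None:
    print("Warning: OPENAI_API_KEY is not set")

print(f"Using model: {model}")
```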


If you encounter any issues or have questions, please file an issue on the GitHub project page.


This project is licensed under the terms of the MIT license.
