

Project description

OpenAssist

OpenAssist is a Python module that makes it easy to create an AI assistant using the openai package.

Installation

pip install openassist

Roles.txt Format Example:

# This is the format for the text contents of the file picked when openassist.Roles is instantiated - it defaults to roles.txt

"""
dog:You are a dog, talk like one to anything the user says next
cat:You are a cat, talk like one to anything the user says next
pirate:You are a pirate, talk like one to anything the user says next
"""

Usage

# ROLES :                  'system' = System Guide, 'assistant' = Ai Assistant, 'user' = User
# DEFAULT GPT DICT :       {'role': 'system', 'content': 'You are a friendly assistant.'}
# DEFAULT ROLES FILE :     'roles.txt'
# COMING SOON :            Parameters to be added to the chat_completion() and completion() functions (temperature, top_p, max_tokens, stop, etc.)
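
For reference, the dictionaries kept in the memory list presumably follow the standard OpenAI chat format shown in DEFAULT GPT DICT above - a short plain-Python illustration (no OpenAssist calls involved):

    # One message dictionary per role; a conversation is just a list of these, oldest first
    conversation = [
        {'role': 'system', 'content': 'You are a friendly assistant.'},
        {'role': 'user', 'content': 'Hello, who are you?'},
        {'role': 'assistant', 'content': 'Hi! I am your friendly assistant.'},
    ]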

"""
Example Of Importing -
    import openassist as OpenAssist - Import the module as OpenAssist
    or
    from openassist import OpenAIResponse, Roles, Memory - Import the classes individually (note case sensitivity)

Example Of Initializing Classes -
    Memory1 = OpenAssist.Memory() - Initialize the memory list
    Roles = OpenAssist.Roles("roles.txt") - Initialize the roles list with the specified file/path

Example Of OpenAI Completions -
    OpenAIRes = OpenAssist.OpenAIResponse(api_key, model_id="gpt-4") - Initialize the OpenAIResponse class (model_id is optional but defaults to gpt-4)
    OpenAIRes.chat_completion(Memory1.return_list()) - Run the OpenAI Chat Completion function
    OpenAIRes.completion(prompt, max_tokens=100) - Run the OpenAI Completion function

OpenAIResponse -
    chat_completion(memory_list) - Run the OpenAI Chat Completion function
    completion(prompt, max_tokens=100) - Run the OpenAI Completion function (max_tokens is optional but defaults to 100)

Roles -
    [Roles.] - Instance Name
    add(role_name, role_content, as_role) - Add a role to the roles list
    remove(role_name) - Remove a role from the roles list
    view() - View the roles list
    select(role_name) - Get a selected role from the roles list
    random() - Get a random role from the roles list
    format(role, content) - Format content into a role dictionary (role can be 'system', 'assistant', or 'user')

Memory -
    [Memory1.] - Instance Name
    add(dictionary) - Add a dictionary to the memory list
    remove(dictionary) - Remove a dictionary from the memory list
    remove_index(index) - Remove a dictionary from the memory list at the specified index
    insert(index, dictionary) - Insert a dictionary into the memory list at the specified index
    clear() - Clear the memory list
    view_list() - View the memory list
    return_list() - Return the memory list
    edit(dictionary, index) - Edit a dictionary in the memory list at the specified index
    get_index(dictionary) - Get the index of a dictionary in the memory list
"""

Dev Log:

# 2.2.6 - Initial Release
# 2.2.7 - Added max_tokens parameter to chat_completion() and completion() functions



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openassist-2.2.7.tar.gz (5.4 kB)

Built Distribution

openassist-2.2.7-py3-none-any.whl (7.9 kB)
