
A Python wrapper for OpenAI API interactions.

Project description

OpenAIFlow


OpenAIFlow is a Python library designed to simplify interactions with the OpenAI API, allowing you to create and manage assistants, threads, and messaging workflows effortlessly.

Features

  • Validate and manage OpenAI API keys
  • Create and manage custom assistants
  • Create assistants with files
  • Start new threads for conversations with assistants
  • Chat in different formats (console, interactive)
  • Retrieve and parse the latest assistant responses

Installation

To install the library, run:

pip install openaiflow

Getting Started

1. Setup

Set up your OpenAI API key by creating a .env file in your project directory and adding the key:

KEY=your_openai_api_key

Alternatively, you can pass the API key directly when initializing OpenAIWrapper, although hard-coding keys is not good practice.
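
For instance, a minimal (but discouraged) direct initialization would look like the sketch below; it assumes the same OpenAIWrapper import path used in the next step:

from openaiflow import openaiflow

# Hard-coding the key: fine for quick experiments, not for shared code
client = openaiflow.OpenAIWrapper("your_openai_api_key")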

2. Initialize OpenAIWrapper:

from openaiflow import openaiflow
from dotenv import load_dotenv
import os

# Load the API key from the .env file (requires python-dotenv)
load_dotenv()
api_key = os.getenv("KEY")
client = openaiflow.OpenAIWrapper(api_key)

3. Validating Your API Key

if client.validate_api_key():
	print("API key is valid.")
else:
	print("Invalid API key.")

Your OpenAI client is now up and running.
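
In a script, you may prefer to fail fast instead of only printing a message. A minimal sketch using the same validate_api_key call:

# Stop early if the key is not valid rather than failing on the first API call
if not client.validate_api_key():
    raise SystemExit("Invalid OpenAI API key; check your .env file.")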

4. Creating an Assistant

An assistant is a configured AI persona, defined by its name, instructions (like tone and purpose), and model (e.g., gpt-3.5-turbo). Different assistants can be tailored for specific tasks like support, creativity, or information.

You can create a custom assistant by providing a name, instructions, and model type:

assistant = client.create_assistant(
    name="Test Assistant",
    instructions="You are a helpful assistant.",
    model="gpt-3.5-turbo"
)
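
Because the name and instructions define the persona, you can keep several assistants side by side for different tasks. A short illustrative sketch (the names and instructions here are made up):

support_assistant = client.create_assistant(
    name="Support Assistant",
    instructions="You answer product questions concisely and politely.",
    model="gpt-3.5-turbo"
)

creative_assistant = client.create_assistant(
    name="Creative Assistant",
    instructions="You help brainstorm and draft creative copy.",
    model="gpt-3.5-turbo"
)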

5. Starting a New Thread

A thread is a session-based conversation with an assistant, maintaining context across messages in that session. Multiple threads can be created with the same assistant, each handling different topics or interactions independently.

A thread is used to initiate a conversation with the assistant:

thread = client.create_thread(assistant_id="your_assistant_id")
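
Since each thread is an independent session, you can run several conversations against the same assistant at once, for example:

# Two independent threads with the same assistant (illustrative IDs)
billing_thread = client.create_thread(assistant_id="your_assistant_id")
onboarding_thread = client.create_thread(assistant_id="your_assistant_id")
# Each thread keeps its own context, so the two conversations never mix.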

6. Interactive Chat

Use the interactive_chat function for a back-and-forth conversation with the assistant:

response = client.interactive_chat(
    thread_id="your_thread_id",
    assistant_id="your_assistant_id",
    message="Hello, how can you help me?"
)
print("Assistant:", response)

A very basic usage of interactive_chat:

running = True  # flag to keep the conversation going
while running:
    user_input = input("You: ")  # the message can come from any source
    if user_input.lower() == "exit":
        running = False
        continue

    response = client.interactive_chat(
        thread_id="your_thread_id",
        assistant_id="your_assistant_id",
        message=user_input
    )
    print(f"Assistant Response: {response}")

7. Console Chat

For a console-based chat where you can type messages directly:

client.chat(
    input_type="console",
    assistant_id="your_assistant_id",
    thread_id="your_thread_id"
)

Type exit to end the chat session.

8. Handling Messages

OpenAIFlow also allows you to handle incoming and outgoing messages in your threads. For example:

response, thread_id, run_id = client.handle_message(
    message="What can you do?",
    thread_id="your_thread_id",
    assistant_id="your_assistant_id"
)

print("Assistant:", response)

9. Parsing Responses

If you need to parse a response from the assistant:

parsed_response = client.parse_assistant_response(response)
print("Assistant says:", parsed_response[0])

10. Error Handling

OpenAIFlow includes structured error handling, so you can handle issues with API keys, message failures, and more.

Example:

try:
	client.create_thread("invalid_id")
except ValueError as e:
	print(e) # Outputs error message
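
The same pattern can wrap a whole workflow so that any ValueError from an invalid ID or a failed message surfaces in one place. A minimal sketch built only from the calls shown above (the IDs are placeholders):

try:
    thread = client.create_thread(assistant_id="your_assistant_id")
    response, thread_id, run_id = client.handle_message(
        message="What can you do?",
        thread_id="your_thread_id",
        assistant_id="your_assistant_id"
    )
    print("Assistant:", client.parse_assistant_response(response)[0])
except ValueError as e:
    print("OpenAIFlow error:", e)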

TODOs & Future Improvements

  • Allow customization of model parameters
  • Add adjustable sleep intervals for response polling
  • Add image support
  • Add streaming support
  • Store messages in memory for easy retrieval and context switching

License

This project is licensed under the MIT License.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openaiflow-0.1.3a0.tar.gz (7.5 kB)

Uploaded Source

Built Distribution

openaiflow-0.1.3a0-py3-none-any.whl (8.3 kB)

Uploaded Python 3

File details

Details for the file openaiflow-0.1.3a0.tar.gz.

File metadata

  • Download URL: openaiflow-0.1.3a0.tar.gz
  • Upload date:
  • Size: 7.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for openaiflow-0.1.3a0.tar.gz

  • SHA256: eaa10146666abc3a0f5ece8f45660678cdcc3f636ade8d23235280114b1cf33b
  • MD5: 848735e52270ab82a493fc81a00fb0f5
  • BLAKE2b-256: f246041e8cbd5aa4decd6dc9d42b644e48c00a027a658c6884211326ac2983c0


File details

Details for the file openaiflow-0.1.3a0-py3-none-any.whl.

File metadata

File hashes

Hashes for openaiflow-0.1.3a0-py3-none-any.whl

  • SHA256: 2aa281068dbf08b0d543cfe615448907a43c687dd3a6b875d669b25a6c909ff5
  • MD5: 7c6b407007de35bb22ca6cb2a70d2141
  • BLAKE2b-256: a49ab3084ab06cff74f24ddce5c5552c45bdeb805b76fbc44e94ca325d9584a2

