An OpenAI tool that runs your prompts over a document
GPTEngineModel Class Documentation
The GPTEngineModel class interacts with the OpenAI GPT-3.5 Turbo engine using prompts from a .docx file. It provides a convenient way to generate responses based on tasks and input prompts.
Table of Contents
- Installation
- Usage
- Class Details
- Example
- Contributing
- License

Installation
Before using the GPTEngineModel class, make sure you have the required dependencies installed:
pip install openai
pip install python-docx
Usage
To use the GPTEngineModel class, follow these steps:
- Create a configuration file containing your OpenAI API key. Refer to the config.py file in the code for an example.
- Import the GPTEngineModel class:
from your_module import GPTEngineModel
- Initialize an instance of the GPTEngineModel class with your main task and input prompt:
engine = GPTEngineModel("Main task description", "Input prompt text")
- Use the generate_response method to generate responses based on a .docx file:
response = engine.generate_response("path/to/your/file.docx")
print(response)
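The first step above references a config.py file whose contents are not shown here. A minimal sketch of what it might contain (API_KEY is an assumed variable name; check the repository's config.py for the actual one):

```python
# config.py -- hypothetical sketch of the configuration file referenced
# in step 1; the real file in the repository may use a different name.
API_KEY = "sk-your-openai-api-key"  # placeholder -- never commit a real key
```

Keep this file out of version control (e.g. via .gitignore) so your key is not published.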
Class Details
Attributes
- main_task (str): The main task description.
- input_prompt (str): The input prompt for the GPT-3.5 Turbo engine.
- api_key (str): The OpenAI API key for authentication.
Methods
generate_response(file_path: str) -> str
Generate a response from OpenAI based on the provided .docx file.
- Parameters: file_path (str): The path to the .docx file.
- Returns: str: The response from OpenAI.
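The method's internals are not documented here, but conceptually it reads the file's text and combines it with the main task and input prompt into a chat request. A hedged sketch of what that payload might look like (build_request is a hypothetical helper, not part of the library):

```python
def build_request(main_task: str, input_prompt: str, document_text: str) -> dict:
    """Hypothetical helper: assemble a GPT-3.5 Turbo chat payload."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            # The main task plays the role of the system instruction...
            {"role": "system", "content": main_task},
            # ...and the prompt plus document text form the user message.
            {"role": "user", "content": f"{input_prompt}\n\n{document_text}"},
        ],
    }
```

The returned dict mirrors the shape of an OpenAI chat-completion request; the actual library may assemble its messages differently.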
Private Methods
- _setup_openai(): Set up the OpenAI API key for authentication.
- _get_context() -> str: Get the context for the current task.
- _get_file(file_path: str) -> str: Read the .docx file and return the text content.
- _batch_process(file_path: str) -> List[dict]: Batch process the instructions and input prompt.
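The implementation of _batch_process is not shown in this documentation. A minimal sketch of how such batching might work, assuming the document text is split into fixed-size character chunks (the chunk size and message layout are assumptions, not the library's actual behavior):

```python
from typing import List

def batch_process(instructions: str, input_prompt: str, text: str,
                  chunk_size: int = 3000) -> List[dict]:
    """Hypothetical sketch: split long document text into chat messages."""
    # One system message carries the instructions; each chunk of the
    # document becomes its own user message so no single message is too long.
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    messages = [{"role": "system", "content": instructions}]
    for chunk in chunks:
        messages.append({"role": "user", "content": f"{input_prompt}\n\n{chunk}"})
    return messages
```

Character-based chunking is a simplification; a production implementation would more likely split on token counts to respect the model's context window.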
Example
from your_module import GPTEngineModel
# Initialize the GPTEngineModel
engine = GPTEngineModel("Summarize an article", "Please summarize the following article:")
# Generate a response from OpenAI based on a .docx file
response = engine.generate_response("path/to/your/article.docx")
print(response)
Contributing
If you would like to contribute to this project or report issues, please refer to the project's GitHub repository: https://github.com/mbsuraj/gptEngine.git.
License
This project is licensed under the MIT License. See the LICENSE.md file for details.
Hashes for gptEngine-0.0.2-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 9292be28813eecacd2a096b90efa695bca6da9df9c157d3f788f60f2f4ea560c
MD5 | 347ebf3053d3d6dd06b162316536c354
BLAKE2b-256 | 9c89c1bb214d2117c9f5114bb6b05a6a16602de3d747900dd243ffc0866cf293