

Project description

qprom a.k.a. Quick Prompt - ChatGPT CLI


qprom

A Python-based CLI tool to quickly interact with OpenAI's GPT models instead of relying on the web interface.

Table of Contents

  1. Description
  2. Installation
  3. Setup
  4. Usage
  5. Todos
  6. License

Description

qprom is a small project that lets you quickly interact with OpenAI's GPT-4 and GPT-3.5 chat APIs without having to use the web UI. This enables quicker response times and better data privacy.

Installation

pip install qprom

Setup

Make sure you have your OpenAI API key.

When running qprom, the tool tries to fetch the OpenAI API key from a credentials file located in the .qprom folder within your home directory. If the API key is not found in that file, you are prompted to provide it, and the key is then stored there for future use.
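The lookup works roughly like this (a minimal illustrative sketch, not qprom's actual source; the file name credentials inside ~/.qprom is an assumption):

from pathlib import Path

# Assumed location; the README only specifies a credentials file in the .qprom folder.
CREDENTIALS_FILE = Path.home() / ".qprom" / "credentials"

def get_api_key() -> str:
    # Reuse a previously stored key if one exists.
    if CREDENTIALS_FILE.exists():
        key = CREDENTIALS_FILE.read_text().strip()
        if key:
            return key
    # Otherwise prompt once and persist the key for future runs.
    key = input("Enter your OpenAI API key: ").strip()
    CREDENTIALS_FILE.parent.mkdir(parents=True, exist_ok=True)
    CREDENTIALS_FILE.write_text(key)
    return key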

Usage

| Argument | Type | Default | Choices | Description | Optional |
| --- | --- | --- | --- | --- | --- |
| -p | String | None | None | Directly enter your prompt (do not use this flag if you intend to provide a multi-line prompt) | yes |
| -m | String | gpt-3.5-turbo | gpt-3.5-turbo, gpt-4, ... | Select the model for the current run | yes |
| -M | String | gpt-3.5-turbo | gpt-3.5-turbo, gpt-4, ... | Set the default model | yes |
| -t | Float | 0.3 | Between 0 and 2 | Configure the temperature | yes |
| -v | Boolean | False | None | Enable verbose mode | yes |
| -c | Boolean | False | None | Enable conversation mode | yes |
| -tk | String | 6500 | None | Set the token limit for the current prompt/conversation | yes |
| -TK | String | 6500 | None | Set the default token limit | yes |

Usage

qprom -p <prompt> -m <model> -t <temperature> -v -c
  • <prompt>: Replace with your prompt
  • <model>: Replace with either gpt-3.5-turbo or gpt-4
  • <temperature>: Replace with a float value between 0 and 2
  • -v: Add this flag to enable verbose mode
  • -c: Add this flag to enable conversation mode

For example:

qprom -p "Translate the following English text to French: '{text}'" -m gpt-4 -t 0.7 -v

This will run the script with the provided prompt, using the gpt-4 model, a temperature of 0.7, and verbose mode enabled.

Multi line prompting

To provide a multi-line prompt, invoke qprom without the -p parameter. You will then be asked for your input at runtime and can enter as many lines as needed. To signal the end of your input, simply enter the string 'END'.

qprom

This will run qprom with its default values (model gpt-3.5-turbo, temperature 0.3) and ask for the prompt at runtime.
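An interactive multi-line session might look like this (the prompt lines are illustrative input; the final END line ends the prompt):

qprom
Summarize the following notes in one sentence:
The release is scheduled for Friday and the changelog still needs review.
END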

Set default model

qprom -M <model-name>

Set token limit for prompt/conversation

qprom -tk <token-limit>

Set default token limit

qprom -TK <token-limit>

Piping console input into qprom

Just pipe the prompt into qprom.

cat prompt.txt | qprom
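Piped input can also be combined with the flags from the table above, for example (an illustrative combination, assuming a local file named main.py):

cat main.py | qprom -m gpt-4 -t 0.2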

Todos

  • Cleanup project / refactoring
  • Add option to set default temperature
  • Add option to re-set the API token
  • Testing
  • Add option to disable streaming and only print the full response

Bug reports:

License

MIT

Support me :heart: :star: :money_with_wings:

If this project provided value and you want to give something back, you can give the repo a star or support me by buying me a coffee.


Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

qprom-0.6.0.tar.gz (9.7 kB)


File details

Details for the file qprom-0.6.0.tar.gz.

File metadata

  • Download URL: qprom-0.6.0.tar.gz
  • Upload date:
  • Size: 9.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for qprom-0.6.0.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 336f69cc50343f5ad22e14a46a4cd5c414ab2947850f5afd698d6cdff328e927 |
| MD5 | 515d0de3b33668461eb8384b2c68e6b5 |
| BLAKE2b-256 | 33dce61f9b80f6d0fb07cf2b260bbc3d1dc583e75a730b00b21a060758c11229 |

See more details on using hashes here.
