
Project description

qprom, a.k.a. Quick Prompt


qprom

A Python-based CLI tool to quickly interact with OpenAI's GPT models instead of relying on the web interface.

Table of Contents

  1. Description
  2. Installation
  3. Usage
  4. Todos
  5. License

Description

qprom is a small project that lets you quickly interact with OpenAI's GPT-4 and GPT-3.5 chat APIs without having to use the web UI. This enables quicker response times and better data privacy.

Installation

pip install qprom

Usage

Argument  Type     Default  Choices               Description                                                            Optional
-p        String   None     None                  Directly enter your prompt (do not use this flag for a                 yes
                                                  multi-line prompt)
-m        String   gpt-4    gpt-3.5-turbo, gpt-4  Select the model                                                       yes
-t        Float    0.3      Between 0 and 2       Configure the temperature                                              yes
-v        Boolean  False    None                  Enable verbose mode                                                    yes
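The flag table above could be parsed with Python's standard argparse module. The following is a hypothetical sketch of such a parser, not qprom's actual source code:

```python
import argparse

def build_parser():
    # Hypothetical parser mirroring the flag table; illustration only.
    parser = argparse.ArgumentParser(prog="qprom")
    parser.add_argument("-p", type=str, default=None,
                        help="prompt (omit this flag for multi-line input)")
    parser.add_argument("-m", type=str, default="gpt-4",
                        choices=["gpt-3.5-turbo", "gpt-4"],
                        help="model to use")
    parser.add_argument("-t", type=float, default=0.3,
                        help="temperature, between 0 and 2")
    parser.add_argument("-v", action="store_true",
                        help="enable verbose mode")
    return parser

args = build_parser().parse_args(["-m", "gpt-3.5-turbo", "-t", "0.7", "-v"])
print(args.m, args.t, args.v, args.p)
```

All four flags are optional, matching the table: omitting `-p` would fall through to the multi-line input mode described below.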


qprom -p <prompt> -m <model> -t <temperature> -v
  • <prompt>: Replace with your prompt
  • <model>: Replace with either gpt-3.5-turbo or gpt-4
  • <temperature>: Replace with a float value between 0 and 2
  • -v: Add this flag to enable verbose mode

For example:

qprom -p "Translate the following English text to French: '{text}'" -m gpt-4 -t 0.7 -v

This will run the script with the provided prompt, using the gpt-4 model, a temperature of 0.7, and verbose mode enabled.

Multi-line prompting

To provide a multi-line prompt, invoke qprom without the -p parameter. It will then ask for your input at runtime, where you can enter as many lines as needed. To signal the end of your input, enter the string 'END' on its own line.

qprom

This will run qprom with its default values (model gpt-4, temperature 0.3) and ask for the prompt at runtime.
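The 'END' sentinel convention described above can be sketched as follows (a minimal illustration, assuming a hypothetical `read_multiline_prompt` helper; this is not qprom's actual source):

```python
def read_multiline_prompt(lines):
    """Collect input lines until the sentinel 'END' is seen,
    then join them into a single prompt string."""
    collected = []
    for line in lines:
        if line.strip() == "END":
            break
        collected.append(line)
    return "\n".join(collected)

# Example: simulate a user typing two lines and then the sentinel.
prompt = read_multiline_prompt(["Translate to French:", "Good morning!", "END"])
print(prompt)  # two lines joined with a newline
```

In the real tool the lines would come from standard input (e.g. `iter(input, None)` or `sys.stdin`) rather than a list.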

Todos

  • Testing
  • Add conversation mode
  • Add option to select default model in config
  • Add option to disable streaming and only print the full response


License

MIT Link

Support me :heart: :star: :money_with_wings:

If this project provided value and you want to give something back, you can give the repo a star or support me by buying me a coffee.

Buy Me A Coffee

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

qprom-0.5.tar.gz (6.2 kB)

Uploaded Source

Built Distribution

qprom-0.5-py3-none-any.whl (7.0 kB)

Uploaded Python 3

File details

Details for the file qprom-0.5.tar.gz.

File metadata

  • Download URL: qprom-0.5.tar.gz
  • Upload date:
  • Size: 6.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for qprom-0.5.tar.gz
Algorithm Hash digest
SHA256 e988d0cd9898cf89c7a99d7de495ec787ec70c6711b46ca4bdc777fa0f12d827
MD5 206a674165af7cc94e92551eeb487f65
BLAKE2b-256 f4eb8c2cbe54acc8bdcd0e0cbb6489a891d91f415fc5e22f5e8e8c045d150beb

See more details on using hashes here.

File details

Details for the file qprom-0.5-py3-none-any.whl.

File metadata

  • Download URL: qprom-0.5-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for qprom-0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 4d2dda64aaecb76f7f69801b488d7395e4cffbf06a1b462d7f6b5be946ea0e34
MD5 e3beb7fb4abbfb0691df28bb6baf5cb8
BLAKE2b-256 f793fd10e74f7cdc0186a3fbfcf208aaf7dae91af213c4dc6a080819e8a8ce0f

See more details on using hashes here.
