chatgpt-prompt-wrapper
Python CLI implementation for ChatGPT.
Requirements
- Python 3.9, 3.10, 3.11
- Poetry (For development)
Installation
$ pip install chatgpt-prompt-wrapper
Usage
Command line interface
$ cg help
usage: cg [-h] [-k KEY] [-c CONF] [-m MODEL] [-t TOKENS] [--show] [--hide] [--multiline]
          [--no_multiline] [--show_cost]
          subcommand [message ...]

positional arguments:
  subcommand            Subcommand to run. Use 'commands' subcommand to list up available subcommands.
  message               Message to send to ChatGPT

optional arguments:
  -h, --help            show this help message and exit
  -k KEY, --key KEY     OpenAI API key.
  -c CONF, --conf CONF  Path to the configuration toml file.
  -m MODEL, --model MODEL
                        ChatGPT Model to use.
  -t TOKENS, --tokens TOKENS
                        The maximum number of tokens to generate in the chat completion. Set 0 to use
                        the max values for the model minus prompt tokens.
  --show                Show prompt for ask command.
  --hide                Hide prompt for ask command.
  --multiline           Use multiline input for chat command.
  --no_multiline        Use single line input for chat command.
  --show_cost           Show cost used.
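For example, once the configuration described below is in place (which defines the test and ask commands), invocations could look like this; the question text here is only illustrative:
$ cg test
$ cg ask "What does ls -la do?"
$ cg -t 0 --show_cost ask "Explain the TOML format in one sentence."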
$ cg commands
Available subcommands:
Reserved commands:
init : Initialize config file with an example command.
cost : Show estimated cost used until now.
commands : List up subcommands (show this).
version : Show version.
help : Show help.
User commands:
test : Example command to test the OpenAI API.
...
Configuration file
File path
The default path to the configuration file is $XDG_CONFIG_HOME/cg/config.toml.
If $XDG_CONFIG_HOME is not defined, ~/.config/cg/config.toml is used.
If that file does not exist and ~/.cg/config.toml exists, the existing file is used.
You can change the path with the -c <file> (--conf <file>) option.
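For example, to run the test command with a configuration file kept somewhere else (the path here is only illustrative):
$ cg -c ~/dotfiles/cg/config.toml test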
How to write the configuration file
The configuration file is written in the TOML format.
Subcommand is defined as the top table name.
The options for each table can be:
- description: Description of the command.
- chat: Set true to make the command chat mode (default is ask mode, with only one exchange).
- show_cost: Set true to show the cost at the end of the command.
- model: The model to use. (default: "gpt-3.5-turbo")
- max_tokens: The maximum number of tokens to generate in the chat completion. Set 0 to use the max value for the model. (default: 0)
- temperature: Sampling temperature (0 ~ 2). (default: 1)
- top_p: Probability (0 ~ 1) that the model will consider the top_p tokens. Do not set both temperature and top_p at the same time. (default: 1)
- presence_penalty: The penalty for the model to return the same token (-2 ~ 2). (default: 0)
- frequency_penalty: The penalty for the model to return the same token multiple times (-2 ~ 2). (default: 0)
- messages: List of message dictionaries, each of which must have role ('system', 'user' or 'assistant') and content (message text).
The options for ask mode:
- show: Set true to show the prompt for a non-chat command.
- hide: Set true to hide the prompt for a non-chat command (default).
The options for chat mode:
- multiline: Set true to use multiline input for the chat command (default).
- no_multiline: Set true to use single-line input for the chat command.
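As a sketch of these options (the review command name and its values are made up for illustration, not part of the default configuration), a table could look like:
[review]
description = "Review a code snippet."
model = "gpt-3.5-turbo"
max_tokens = 0
temperature = 0.2
show_cost = true
[[review.messages]]
role = "system"
content = "You are a careful code reviewer. Point out bugs and style issues."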
Here is an example configuration (when you execute cg init for the first time, this configuration file is created):
[ask]
description = "Ask a question w/o predefined prompt."
[test]
# Example command to test the OpenAI API, taken from below.
# [Chat completion - OpenAI API](https://platform.openai.com/docs/guides/chat/introduction)
description = "Example command to test the OpenAI API."
show = true
[[test.messages]]
role = "system"
content = "You are a helpful assistant."
[[test.messages]]
role = "user"
content = "Who won the world series in 2020?"
[[test.messages]]
role = "assistant"
"content" = "The Los Angeles Dodgers won the World Series in 2020."
[[test.messages]]
role = "user"
content = "Where was it played?"
[sh]
description = "Ask a shell scripting question."
[[sh.messages]]
role = "user"
content = "You are an expert of the shell scripting. Answer the following questions."
[py]
description = "Ask a python programming question."
[[py.messages]]
role = "user"
content = "You are an expert python programmer. Answer the following questions."
[chat]
description = "Chat with the assistant."
chat = true
[[chat.messages]]
role = "user"
content = "Let's enjoy a chat."
These messages will be sent as a prompt before your input message.
You can also write full questions in the configuration and use cg without an input message, as the test command in the first example does.
Command examples:
- test
- sh
- py
- chat
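With the example configuration above, these could be invoked as follows (the question text is only illustrative):
$ cg test
$ cg sh "How do I list only hidden files in the current directory?"
$ cg py "How do I read a TOML file in Python 3.11?"
$ cg chat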
Development
Poetry
Use Poetry to set up the environment.
To install Poetry, run:
$ pip install poetry
or use pipx (where x is 3 or whatever suffix matches your Python version, e.g. pip3.11).
Set up the Poetry environment:
$ poetry install
Then enter the environment:
$ poetry shell
pre-commit
To check code at commit time, use pre-commit.
The pre-commit command will be installed in the Poetry environment.
First, run:
$ pre-commit install
Then pre-commit will run at each commit.
Sometimes, you may want to skip the check. In that case, run:
$ git commit --no-verify
You can run pre-commit on the entire repository manually:
$ pre-commit run -a
pytest
Tests are written with pytest.
Write tests in the /tests directory.
To run tests, run:
$ pytest
The default setting runs tests in parallel with -n auto.
To run tests serially, run:
$ pytest -n 0
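The -n option comes from the pytest-xdist plugin. A typical way to configure such a default in pyproject.toml looks like the following sketch (the actual settings in this repository may differ):
[tool.pytest.ini_options]
addopts = "-n auto"
testpaths = ["tests"]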
GitHub Actions
If you push the repository to GitHub, GitHub Actions will run a test job.
The job also runs on pull requests.
It checks code with pre-commit and runs tests with pytest.
It also makes a test coverage report and uploads it to the coverage branch.
You can see the test status as a badge in the README.
Renovate
If you want to update dependencies automatically, install Renovate into your repository.
File details
Details for the file chatgpt_prompt_wrapper-0.0.2.tar.gz.
File metadata
- Download URL: chatgpt_prompt_wrapper-0.0.2.tar.gz
- Upload date:
- Size: 19.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.11.2 Darwin/22.4.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0c2c0d577f28707966ad58cbbb2843150e184de5663c836dccb95b0a7e89fba1
MD5 | 89e0991c370256a78af4f4203b4a3f3b
BLAKE2b-256 | 8ee01e991113e0506b50747d35b296360a287a61cc53062e5f8a8d8ea2e9d083
File details
Details for the file chatgpt_prompt_wrapper-0.0.2-py3-none-any.whl.
File metadata
- Download URL: chatgpt_prompt_wrapper-0.0.2-py3-none-any.whl
- Upload date:
- Size: 21.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.11.2 Darwin/22.4.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4cb67d25f9bac93b40df46ed0a0c629cbb23c24d143d2b19ca64ae5a0ead8814
MD5 | a0d9bde7177c0f6b7c611fb0999726c7
BLAKE2b-256 | 9cd72d9d5bf9fbac1d03cebed95dbe37b25976868f19645d07c3801dc5d6bbd8