A CLI tool to prompt LLMs

Project description

prompt_llm

Version

0.1.2

  • Prompt LLMs via the CLI

Installation

  1. Clone the repo

  2. Install

  • Local (editable):
    • pip install --editable .
  • PyPI:
    • pip install prompt_llm

  3. Install shell auto-completion:
  • prompt_llm --install-completion
  • source ~/.bashrc
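The installation steps above can be run as one sequence. This is a sketch: `<repo_url>` is a placeholder, since the project page does not list the repository URL.

```shell
# Clone and enter the project (<repo_url> is a placeholder).
git clone <repo_url>
cd prompt_llm

# Install in editable mode for local development,
# or `pip install prompt_llm` to install from PyPI instead.
pip install --editable .

# Enable shell auto-completion, then reload the shell config.
prompt_llm --install-completion
source ~/.bashrc
```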

Config:

  • Supported config options: --api-key, --model, --system, --temperature

Set Config:

  • prompt_llm add-config --api-key "<api_key>" --system "<system_message>" --temperature <temperature>

Remove Config:

  • prompt_llm rm-config temperature

View Config:

  • prompt_llm view-config
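A typical config round-trip looks like the following. This is a sketch assuming the hyphenated subcommand names used elsewhere on the page (the set command is shown as `add_config` above, but the other subcommands are hyphenated); the API key and values are placeholders.

```shell
# Store an API key, system message, and temperature.
prompt_llm add-config --api-key "<api_key>" --system "You are a helpful assistant." --temperature 0.7

# Inspect the stored settings.
prompt_llm view-config

# Drop a single setting by name.
prompt_llm rm-config temperature
```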

Run the CLI

  • prompt_llm openai "Tell me something"

To contribute

1. Build the project

python -m build

2. Upload to PyPI

python3 -m twine upload --repository pypi dist/* --verbose

Project details


Download files

Download the file for your platform.

Source Distribution

prompt_llm-0.1.2.tar.gz (5.5 kB)

Uploaded Source

Built Distribution

prompt_llm-0.1.2-py3-none-any.whl (6.1 kB)

Uploaded Python 3

File details

Details for the file prompt_llm-0.1.2.tar.gz.

File metadata

  • Download URL: prompt_llm-0.1.2.tar.gz
  • Upload date:
  • Size: 5.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

Hashes for prompt_llm-0.1.2.tar.gz
Algorithm Hash digest
SHA256 fd4eb4c5b81b5b0213f74ee677ed74e419939914395115588e7d9c32ea41021b
MD5 1cef5eb56c92e6418f9f00f6bde9a512
BLAKE2b-256 043fded52ae06d551c9bf0c4e0ffffb8431dbe4be5af0e46915fa25668690e72

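The hashes above can be used to verify a downloaded file before installing it. For example, checking the sdist against its published SHA256 with `sha256sum` (assuming the tarball has been downloaded to the current directory):

```shell
# Verify the downloaded sdist against the SHA256 digest listed above.
# sha256sum --check reads "<digest>  <filename>" lines and exits non-zero
# on a mismatch.
echo "fd4eb4c5b81b5b0213f74ee677ed74e419939914395115588e7d9c32ea41021b  prompt_llm-0.1.2.tar.gz" | sha256sum --check
```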

File details

Details for the file prompt_llm-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: prompt_llm-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 6.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

Hashes for prompt_llm-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 927ac31e190cffc1faa307304311577218e94fa3e0122924d0f0eee27f6e4fd4
MD5 f9a352849bc716ce26abe9270b5ff041
BLAKE2b-256 ca6d0660e6167a7539b020e80d950d1d867b1306313fa540a32f2b63f8ec6e0a

