A CLI tool to prompt LLMs

Project description

prompt_llm

Version

0.1.3

  • Prompt LLMs via the CLI

Installation

  1. Clone the repo

  2. Install

  • Local:
    • pip install --editable .
  • PyPI:
    • pip install prompt_llm
  3. Install auto-completion:
  • prompt_llm --install-completion
  • source ~/.bashrc

Config

  • Supported configs: --api-key, --model, --system, --temperature

Set Config

  • prompt_llm add-config --api-key "<api_key>" --system "<system_message>" --temperature <temperature>

Remove Config

  • prompt_llm rm-config temperature

View Config

  • prompt_llm view-config

Run the CLI

  • prompt_llm openai "Tell me something"
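The CLI can also be driven from a script. A minimal sketch using Python's standard `subprocess` module, assuming `prompt_llm` has been installed and is on your PATH (this wrapper is illustrative, not part of the package):

```python
import shutil
import subprocess

# The same invocation as the CLI example above.
cmd = ["prompt_llm", "openai", "Tell me something"]

if shutil.which("prompt_llm"):
    # Capture the model's reply as text instead of printing to the terminal.
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
else:
    print("prompt_llm is not on PATH; install it first")
```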

To contribute

1. Build project

python -m build

2. Upload

python3 -m twine upload --repository pypi dist/* --verbose

Project details


Download files

Download the file for your platform.

Source Distribution

prompt_llm-0.1.3.tar.gz (5.5 kB)

Uploaded Source

Built Distribution


prompt_llm-0.1.3-py3-none-any.whl (6.1 kB)

Uploaded Python 3

File details

Details for the file prompt_llm-0.1.3.tar.gz.

File metadata

  • Download URL: prompt_llm-0.1.3.tar.gz
  • Upload date:
  • Size: 5.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

Hashes for prompt_llm-0.1.3.tar.gz
Algorithm Hash digest
SHA256 3f6b3619ad64a8e2f40a28087efd13c285074c1e6f132568370739d6a2650c78
MD5 a5bd1aa473c15e4b915a73238de55e95
BLAKE2b-256 7e72f77bd5074747e18c763d9decc68cdb810c6940162147d121f749c495a1fb
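The SHA256 digest above can be verified locally after downloading the sdist. A minimal sketch using Python's standard `hashlib` (the helper name is illustrative):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest published for prompt_llm-0.1.3.tar.gz on this page.
expected = "3f6b3619ad64a8e2f40a28087efd13c285074c1e6f132568370739d6a2650c78"

# After downloading, compare:
# assert sha256_of("prompt_llm-0.1.3.tar.gz") == expected
```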


File details

Details for the file prompt_llm-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: prompt_llm-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 6.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

Hashes for prompt_llm-0.1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 71a9e62319adf3d63208e8c0397d37492896c435b7bb9ed1b3f58537c30fe377
MD5 3a53f380ee217942dfb0ebf63f1f0199
BLAKE2b-256 229b8b7fd8fc79fb6321214c76523e4bceaff250e5ba97d76b1fc32685641b5d

