
Interact with any LLM from your terminal

Project description

Lhammai CLI logo




Lhammai CLI lets you interact with any LLM directly from your terminal through a simple, intuitive interface. Powered by the any-llm library, it connects seamlessly to a wide range of LLM providers, including OpenAI, Anthropic, and local servers such as Ollama and llamafile. For the full list of supported providers, see the official any-llm documentation.

The name Lhammai comes from "Lhammas," Noldorin for "account of tongues": a work of fictional sociolinguistics written by J. R. R. Tolkien in 1937.

Getting Started

Prerequisites

Installation

You can install the package from PyPI using pip (recommended):

pip install "lhammai-cli[ollama]"

From Source

  1. Clone the repository and navigate to the source directory:

    git clone https://github.com/dpoulopoulos/lhammai-cli.git && cd lhammai-cli
    
  2. Install the dependencies using uv:

    uv sync --group ollama
    
  3. Activate the virtual environment:

    source .venv/bin/activate
    

This installs the necessary dependencies to communicate with a local model via Ollama.

Usage

To begin, you'll need to run the Ollama server. For this example, you can use Docker for a quick setup.

This approach has some limitations, especially on a Mac: Docker Desktop on macOS doesn't support GPU access, so it's better to run Ollama as a standalone application there. For more detailed instructions, check the official Ollama documentation.

  1. Run the following command to start the Ollama server in a Docker container:

    a. CPU only:

    docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
    

    b. Nvidia GPU:

    docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
    
  2. Run a model:

    docker exec -it ollama ollama run gemma3:4b
    
  3. Interact with the model:

    lhammai Hello!
    
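If the model doesn't respond, a quick way to confirm the server from step 1 is reachable is to hit Ollama's root endpoint, which returns a short status string when the server is up. This check uses only `curl` and Ollama's default port; it isn't part of the project's own docs:

```shell
# Sanity check: Ollama answers on its default port when the container is running.
curl -s http://localhost:11434 || echo "Ollama is not reachable on :11434"
```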

Configure your application by creating a .env file in the root directory and adding your options:

    cp .default.env .env
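The available options live in .default.env. As a sketch, a .env might look like the following; the variable names here are illustrative assumptions, not confirmed keys, so check .default.env in the repository for the real ones:

```shell
# .env -- hypothetical keys, for illustration only
MODEL=ollama/gemma3:4b            # which provider/model to use
API_BASE=http://localhost:11434   # where the Ollama server listens
```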

You can also pipe content to lhammai from standard input. This is useful for analyzing logs, summarizing files, etc.:

cat dev.log | lhammai -p "explain:"

License

See the LICENSE file for details.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lhammai_cli-0.1.0a10.tar.gz (7.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

lhammai_cli-0.1.0a10-py3-none-any.whl (10.6 kB)

Uploaded Python 3

File details

Details for the file lhammai_cli-0.1.0a10.tar.gz.

File metadata

  • Download URL: lhammai_cli-0.1.0a10.tar.gz
  • Size: 7.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for lhammai_cli-0.1.0a10.tar.gz
Algorithm Hash digest
SHA256 febecdd9b3330c3ffb33962f46813b8be698a6351f0907efaed424079aad0d23
MD5 bf0dd088f1847f0d233a3f50722e4d7b
BLAKE2b-256 94ef53aec9526a6eefae0774e2eee33ead19921794b2ebca61f232f3320afa4f

See more details on using hashes here.
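To check a downloaded file against the digests above, a small helper like the following works. `sha256sum` and `awk` are standard tools; the throwaway demo file stands in for the real download, so the commands are runnable end to end:

```shell
# check_sha256 FILE EXPECTED_DIGEST -> prints OK or MISMATCH
check_sha256() {
    actual=$(sha256sum "$1" | awk '{print $1}')
    if [ "$actual" = "$2" ]; then echo "OK"; else echo "MISMATCH"; fi
}

# Demo with a throwaway file. For the real check, pass the downloaded
# lhammai_cli-0.1.0a10.tar.gz and the SHA256 digest from the table above.
printf 'demo\n' > demo.txt
check_sha256 demo.txt "$(sha256sum demo.txt | awk '{print $1}')"
# → OK
```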

Provenance

The following attestation bundles were made for lhammai_cli-0.1.0a10.tar.gz:

Publisher: release.yml on dpoulopoulos/lhammai-cli

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file lhammai_cli-0.1.0a10-py3-none-any.whl.

File metadata

  • Download URL: lhammai_cli-0.1.0a10-py3-none-any.whl
  • Size: 10.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for lhammai_cli-0.1.0a10-py3-none-any.whl
Algorithm Hash digest
SHA256 4a04e23a441314f7278a4e80b482f1f11ca9530ab0f6ca4aec0abfd2430fb167
MD5 2d977b6fd9b3e0fdbe16a11ce2e0658a
BLAKE2b-256 22842a272522efe0464b6e8e42b021a67ad994d30c33c6ea5da18bfbd495ce7a

See more details on using hashes here.

Provenance

The following attestation bundles were made for lhammai_cli-0.1.0a10-py3-none-any.whl:

Publisher: release.yml on dpoulopoulos/lhammai-cli

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
