LLMHub is a lightweight management platform designed to streamline working with LLMs.

Project description

LLMHub CLI

LLMHub CLI is a command-line tool for managing and interacting with various Large Language Model (LLM) servers. It lets you start, stop, and update LLM processes quickly and efficiently.

Features

  • Manage LLM servers
  • Start, stop, and update LLM processes
  • List available models and their statuses
  • OpenAI-compatible API endpoints for completions and models
  • Easily configurable via YAML files
  • Supports different engines and quantization formats

Installation

You can install LLMHub CLI directly from PyPI:

pip install llmhub-cli

Usage

After installation, you can use the llmhub command to interact with the tool. Below are some example commands:

Start a Process

llmhub start MythoMax-L2-13B

Stop a Process

llmhub stop MythoMax-L2-13B

Update All Processes

llmhub update

List All Models

llmhub list-models

Check Status

llmhub status

Configuration

The configuration is handled via YAML files. You can place your config.yaml file in the ~/.llmhub/ directory or specify a custom path when initializing the ConfigManager.
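If you prefer to bootstrap the default location from Python, a minimal sketch looks like this (the `write_default_config` helper is illustrative, not part of LLMHub's API; only the `~/.llmhub/config.yaml` path comes from the documentation above):

```python
import pathlib

# Minimal config fragment using the top-level keys documented below.
MINIMAL_CONFIG = """\
port: 8080
enable_proxy: true
engine_port_min: 8081
engine_port_max: 10000
"""

def write_default_config(base_dir="~"):
    """Create <base_dir>/.llmhub/ and write a minimal config.yaml, returning its path."""
    cfg_dir = pathlib.Path(base_dir).expanduser() / ".llmhub"
    cfg_dir.mkdir(parents=True, exist_ok=True)
    cfg_path = cfg_dir / "config.yaml"
    cfg_path.write_text(MINIMAL_CONFIG)
    return cfg_path
```

Passing a directory other than `~` is handy for testing without touching your real home directory.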

Example Configuration

on_start:
  MythoMax-L2-13B:
    quant: Q5_K_M
    engine: llamacppserver
    context_size: [512, 1024, 2048]

port: 8080
enable_proxy: true
engine_port_min: 8081
engine_port_max: 10000

engines:
  llamacppserver:
    path: /path/to/llamacppserver
    arguments: --color -t 20 --parallel 2 --mlock --metrics --verbose
    model_flag: "-m"
    context_size_flag: "-c"
    port_flag: "--port"
    api_key_flag: "--api-key"
    file_types: [gguf]
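To illustrate how the `*_flag` keys are meant to be combined, here is a sketch (not LLMHub's actual implementation) of assembling a launch command from an engine entry like the one above; the model path and resolved values are made up for the example:

```python
import shlex

# Engine entry mirroring the example configuration above.
ENGINE = {
    "path": "/path/to/llamacppserver",
    "arguments": "--color -t 20 --parallel 2 --mlock --metrics --verbose",
    "model_flag": "-m",
    "context_size_flag": "-c",
    "port_flag": "--port",
    "api_key_flag": "--api-key",
}

def build_command(engine, model_path, context_size, port, api_key=None):
    """Assemble the argv list for launching the engine process."""
    argv = [engine["path"], *shlex.split(engine["arguments"])]
    argv += [engine["model_flag"], model_path]
    argv += [engine["context_size_flag"], str(context_size)]
    argv += [engine["port_flag"], str(port)]
    if api_key:  # only append the key flag when a key is configured
        argv += [engine["api_key_flag"], api_key]
    return argv
```

Each flag is emitted as a separate argv element, so paths with spaces survive without extra quoting.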

API Endpoints

LLMHub CLI also provides OpenAI-compatible API endpoints:

  • /v1/completions: Handle completion requests.
  • /v1/chat/completions: Handle chat completion requests.
  • /v1/models: List available models.
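Because the endpoints follow the OpenAI schema, any OpenAI-compatible client should work. As a sketch, assuming the proxy port (8080) and model name from the example configuration, a raw chat-completion request can be built with the standard library:

```python
import json
import urllib.request

# Build (but do not send) an OpenAI-compatible chat completion request.
payload = {
    "model": "MythoMax-L2-13B",
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# With a server running, urllib.request.urlopen(req) sends the request.
```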

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License.

Contact

For any questions or issues, please open an issue on the GitHub repository.


Download files

Download the file for your platform.

Source Distribution

llmhub_cli-0.1.9.tar.gz (16.0 kB)

Built Distribution

llmhub_cli-0.1.9-py3-none-any.whl (22.0 kB)

File details

Details for the file llmhub_cli-0.1.9.tar.gz.

File metadata

  • Filename: llmhub_cli-0.1.9.tar.gz
  • Size: 16.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

  • SHA256: d6a28f2d5c4e91faa5f209b35a7cf0b570ff35033a71f4dc80523448b8e72a77
  • MD5: 753ffae71a8c21356dfed59c6d25aff1
  • BLAKE2b-256: c451a622586543f7fc7e1dc30ec3d13650958534f40655bf54a873aab5240cea

File details

Details for the file llmhub_cli-0.1.9-py3-none-any.whl.

File metadata

  • Filename: llmhub_cli-0.1.9-py3-none-any.whl
  • Size: 22.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

  • SHA256: f64269a22936f2f2a131a7ff8e66c9c92c825ef36a16a7ff86331bb685f2d376
  • MD5: 1124276e726bbb64fcd720a8aefcb37b
  • BLAKE2b-256: ecef6be38df0e22a27940c01d6da8be4f1b563f25b28d23bed6e4e65172dca6e
