A minimal CLI chat application using transformers with core tool-use support.

Project description

Minimal Transformers CLI Chat Application

A simple command-line interface to chat with any instruction-tuned LLM from the Hugging Face Hub using the transformers library.

Features

  • Interactive CLI chat.
  • Supports any model with a chat template.
  • Automatic device mapping (CPU/GPU).
  • Session management: Save and load chat history automatically.
  • Opt-in Core Tools: Choose which system tools to enable for the model.

Prerequisites

  • Python 3.8+
  • (Optional but recommended) A virtual environment.

Installation

  1. Clone the repository and install it using pip:
    pip install .
    

Usage

Basic Chat

Once installed, you can start the chat from anywhere in your terminal using the colab_chat command:

colab_chat

Enabling Core Tools

By default, all tools are disabled for security. You can enable them individually using flags:

  • --calculate: Enable mathematical expression evaluation.
  • --shell: Enable execution of bash shell commands.
  • --read: Enable reading files from your system.
  • --write: Enable writing or updating files on your system.
  • --yolo: Enable all core tools at once (BE CAREFUL).
  • --system_prompt: Path to a text file containing a custom system prompt.

Example: Enable all tools

colab_chat --yolo

[!WARNING] Security Note: Enabling --shell, --read, --write, or --yolo gives the LLM direct access to your system. Always review the tool calls (indicated by [*] Calling tool: ...) in your console.
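The --calculate tool is the only one that does not touch the filesystem or shell. As a rough sketch of how such a tool could safely evaluate expressions (an assumption for illustration only; the project's actual implementation is not shown here), arithmetic can be walked over a restricted AST instead of passed to raw eval():

```python
import ast
import operator

# Hypothetical sketch of a safe calculator tool: only arithmetic
# operators on numeric literals are permitted, everything else raises.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def calculate(expr):
    def ev(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](ev(node.operand))
        raise ValueError(f"unsupported expression: {expr!r}")
    return ev(ast.parse(expr, mode="eval").body)

print(calculate("2 * (3 + 4)"))  # 14
```

This is why --calculate is comparatively low-risk: names, attribute access, and function calls are simply not reachable through the evaluator.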

Save/Load Sessions

Sessions are saved to the sessions/ directory in your current working folder.

  • Load a previous session:
    colab_chat --load sessions/chat_20260407_120000.json
    
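Assuming the session files store the standard transformers message format (a list of role/content dicts — an assumption; the actual colab_chat schema may differ), a saved session could be written and restored like this:

```python
import json
from pathlib import Path

# Hypothetical session layout, mirroring the chat-template message
# convention used by transformers; the real schema may differ.
session = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is 2 + 2?"},
    {"role": "assistant", "content": "2 + 2 = 4."},
]

path = Path("sessions") / "chat_example.json"
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(session, indent=2))

# Loading the file back restores the full message history.
loaded = json.loads(path.read_text())
```

A file in this shape can be handed straight back to a chat template, which is what makes resuming a conversation with --load straightforward.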

Advanced Options

  • --model: Hugging Face model ID.
  • --max_tokens: Maximum new tokens to generate (default: 8192).
  • --no_save: Disable session saving.
  • --session_name: Set a custom name for the saved session.
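The example filename above (chat_20260407_120000.json) suggests a timestamp-based default when --session_name is not given. A minimal sketch of that naming logic, assuming a chat_YYYYMMDD_HHMMSS pattern (the real colab_chat code may differ):

```python
from datetime import datetime

# Hypothetical default-naming helper: use the custom session name when
# provided, otherwise fall back to a chat_<timestamp> name.
def session_filename(session_name=None):
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    name = session_name or f"chat_{stamp}"
    return f"{name}.json"
```

So `colab_chat --session_name my_project` would save to sessions/my_project.json under this scheme, while a plain `colab_chat` run gets a fresh timestamped file.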

Download files

Download the file for your platform.

Source Distribution

colab_chat-0.1.0.tar.gz (17.6 kB)

Built Distribution

colab_chat-0.1.0-py3-none-any.whl (18.2 kB)

File details

Details for the file colab_chat-0.1.0.tar.gz.

File metadata

  • Download URL: colab_chat-0.1.0.tar.gz
  • Upload date:
  • Size: 17.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes

Hashes for colab_chat-0.1.0.tar.gz

  • SHA256: dfc3d399d3556244e1d0ca0543b8253a539f4549513317903ab7b5cccbdba1c7
  • MD5: 7bf88bd39aed8a63ced9eea8bebb4808
  • BLAKE2b-256: 37a8bb354a91539d29fbe1d62ea3effc9005d8f68d070ea33ba576a6be0fe18d

File details

Details for the file colab_chat-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: colab_chat-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 18.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes

Hashes for colab_chat-0.1.0-py3-none-any.whl

  • SHA256: 496a4281601fdc6a7935eaf5975bf9cafb150ee19f21801f6d10b62625a5b336
  • MD5: 774f7de4cdf7c3a77f6f724a4dd2ec7c
  • BLAKE2b-256: fb5f30c987c281900a274ae6d091ce3eb2ae3f8bc03458ea3ddd82eddc63e15c
