
A command-line interface for the Ollama API

Project description

mdllama


A CLI tool that lets you chat with Ollama and OpenAI models right from your terminal, with built-in Markdown rendering and web-search functionality.

Features

  • Chat with Ollama models from the terminal
  • Built-in Markdown rendering
  • Web-search functionality
  • Extremely simple installation and removal (see below)

Screenshots

Chat Interface

[Screenshot: the mdllama chat interface]

Help

[Screenshot: output of the help command]

Interactive Commands

When using mdllama run for interactive chat, you have access to special commands:

Basic Commands

  • exit or quit - End the interactive session
  • clear - Clear the current conversation context
  • models - Show a numbered list of available models to select from
  • """ - Start/end multiline input mode for longer messages

File and Content Integration

  • file:path/to/file - Include a local file's content in your next message
  • site:url - Fetch a website's content and add it to the conversation context
  • system:prompt - Set or change the system prompt (use system: without a prompt to clear it)

Web Search Commands

  • search:query - Search the web and add the results to the conversation context
  • searchask:query|question - Search for a specific query and ask a question about the results
  • websearch:question - Let the AI generate an optimized search query and fetch the results
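The `searchask:query|question` syntax splits on the first `|`. A minimal parser for that shape might look like this (a hypothetical sketch, shown only to make the syntax concrete):

```python
def parse_searchask(command: str) -> tuple[str, str]:
    # Hypothetical parser for the `searchask:query|question` syntax
    # described above; mdllama's real parsing may differ.
    prefix = "searchask:"
    if not command.startswith(prefix):
        raise ValueError("not a searchask command")
    query, sep, question = command[len(prefix):].partition("|")
    if not sep:
        raise ValueError("searchask requires the form 'query|question'")
    return query.strip(), question.strip()
```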

Runtime Controls

  • temp:value - Change the temperature setting (0.0 to 1.0)
  • model:name - Switch to a different model (or show the list if the name is omitted)
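A `temp:` handler only needs to parse the number and validate the 0.0-1.0 range noted above. A minimal sketch (hypothetical helper, not mdllama's code):

```python
def parse_temperature(command: str) -> float:
    # Hypothetical handler for the `temp:value` runtime control,
    # enforcing the 0.0-1.0 range described above.
    value = float(command[len("temp:"):])
    if not 0.0 <= value <= 1.0:
        raise ValueError(f"temperature {value} is outside [0.0, 1.0]")
    return value
```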

Command Line Options

  • mdllama search "query" - Standalone web search command
  • mdllama sessions - List all saved conversation sessions
  • mdllama load-session session_id - Load a previous conversation
  • mdllama clear-context - Clear current conversation context
  • mdllama models - List available models
  • mdllama pull model_name - Download a model from the Ollama registry

OpenAI and Provider Support

Supported Providers

  • Ollama: Local models running on your machine
  • OpenAI: Official OpenAI API (GPT-3.5, GPT-4, etc.)
  • OpenAI-compatible: Any API that follows OpenAI's format (Hackclub AI, LocalAI, etc.)
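These providers are interchangeable because they all accept the same request shape at the `/v1/chat/completions` endpoint (Ollama also exposes an OpenAI-compatible endpoint). A minimal sketch of building such a request using only the standard library; the URL and model name below are illustrative, not defaults taken from mdllama:

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    # Build the URL and JSON body for an OpenAI-style chat completion
    # call. Any "OpenAI-compatible" server accepts this same shape,
    # which is why one client can target Ollama, OpenAI, or services
    # like Hackclub AI by changing only the base URL.
    url = base_url.rstrip("/") + "/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body
```

Sending the request additionally requires an `Authorization: Bearer <key>` header for keyed providers.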

Setup Instructions

For Ollama (Default)

mdllama setup
# Or specify explicitly
mdllama setup --provider ollama

For OpenAI

mdllama setup --provider openai
# Will prompt for your OpenAI API key

For OpenAI-Compatible APIs

mdllama setup --provider openai --openai-api-base https://ai.hackclub.com
# Then provide your API key when prompted

Usage Examples

# Use with OpenAI
mdllama chat --provider openai "Explain quantum computing"

# Use with specific model and provider
mdllama run --provider openai --model gpt-4

# Interactive session with streaming
mdllama run --provider openai --stream --render-markdown

Live Demo

Go to the mdllama demo to try it out live in your browser. The API key is 9c334d5a0863984b641b1375a850fb5d.

[!NOTE] Try asking the model to give you some markdown-formatted text, or test the web search features:

  • Give me a markdown-formatted text about the history of AI.
  • search:Python 3.13 (web search)
  • site:python.org (fetch website content)
  • websearch:What are the latest Python features? (AI-powered search)

So try it out and see how it works!

Installation

Install using package manager (recommended)

Debian/Ubuntu Installation

  1. Add the PPA to your sources list:

    echo 'deb [trusted=yes] https://packages.qincai.xyz/debian stable main' | sudo tee /etc/apt/sources.list.d/qincai-ppa.list
    sudo apt update
    
  2. Install mdllama:

    sudo apt install python3-mdllama
    

Fedora Installation

  1. Download the latest RPM from: https://packages.qincai.xyz/fedora/

    Or, to install directly:

    sudo dnf install https://packages.qincai.xyz/fedora/mdllama-<version>.noarch.rpm
    

    Replace <version> with the latest version number.

  2. (Optional, but highly recommended) To enable the repository for future updates, create /etc/yum.repos.d/qincai-ppa.repo:

    [qincai-ppa]
    name=Raymont's Personal RPMs
    baseurl=https://packages.qincai.xyz/fedora/
    enabled=1
    metadata_expire=0
    gpgcheck=0
    

    Then install with:

    sudo dnf install mdllama
    

  3. Install the ollama library from pip:

pip install ollama

You can also install it globally with:

sudo pip install ollama

[!NOTE] Fedora provides no system package for the ollama library (python3-ollama), so earlier RPM builds required installing it manually with pip before mdllama could use Ollama models. Current RPM packages resolve this with a post-installation script that installs the ollama library via pip automatically; the manual step above is only needed as a fallback.


PyPI Installation (Cross-Platform)

Install via pip (recommended for Windows/macOS and Python virtual environments):

pip install mdllama

Traditional Bash Script Installation (Linux)

[!WARNING] This method of installation and removal is deprecated and should be avoided. Please use the pip method, or use the DEB/RPM packages instead.

To install mdllama using the traditional bash script, run:

bash <(curl -fsSL https://raw.githubusercontent.com/QinCai-rui/mdllama/refs/heads/main/install.sh)

To uninstall mdllama, run:

bash <(curl -fsSL https://raw.githubusercontent.com/QinCai-rui/mdllama/refs/heads/main/uninstall.sh)

License

This project is licensed under the GNU General Public License v3.0. See the LICENSE file for details.


Project details

Download files

  • Source Distribution: mdllama-4.2.4.tar.gz (48.2 kB)
  • Built Distribution: mdllama-4.2.4-py3-none-any.whl (51.6 kB)

File details

mdllama-4.2.4.tar.gz

  • Size: 48.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

Hashes:

  • SHA256: c1b7b0eb8ddc1dfd5667f09a24d333931c3ffa31f31a33d82edd6be544481419
  • MD5: 359237808c75fd9cb7535eda5ed7ecc1
  • BLAKE2b-256: de8fef8665af96a89e6e64c7190f7438e9791ee83bff6564b9e3fd40d7151b20

Provenance: attestation bundles for mdllama-4.2.4.tar.gz were published by publish-to-pypi.yml on QinCai-rui/mdllama. Values shown reflect the state when the release was signed and may no longer be current.

mdllama-4.2.4-py3-none-any.whl

  • Size: 51.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

Hashes:

  • SHA256: 6fd32f015dfd3bad3e0a3f223f8cdd8c9cd3b3e05448d994f4072ea70e4d1c85
  • MD5: f6d21b984d414ea3169e4a5442cc1b79
  • BLAKE2b-256: ed8446c1c32ee5958cd1e1f4ef52332231003ee931816ec23f0e216155b9b66f

Provenance: attestation bundles for mdllama-4.2.4-py3-none-any.whl were published by publish-to-pypi.yml on QinCai-rui/mdllama. Values shown reflect the state when the release was signed and may no longer be current.
