
A command-line interface for Ollama API

Project description

mdllama


A CLI tool that lets you chat with Ollama and OpenAI models right from your terminal, with built-in Markdown rendering and web-search functionality.

Features

  • Chat with Ollama models from the terminal
  • Built-in Markdown rendering
  • Web-search functionality
  • Extremely simple installation and removal (see below)

Screenshots

Chat Interface

(screenshot of the chat interface)

Help

(screenshot of the help output)

Interactive Commands

When using mdllama run for interactive chat, you have access to special commands:

See mdllama(1) for details:

man 1 mdllama

OpenAI and Provider Support

Supported Providers

  • Ollama: Local models running on your machine
  • OpenAI: Official OpenAI API (GPT-3.5, GPT-4, etc.)
  • OpenAI-compatible: Any API that follows OpenAI's request format (for example, Groq: https://groq.com)

Setup Instructions

For Ollama (Default)

mdllama setup
# Or specify explicitly
mdllama setup --provider ollama

For OpenAI

mdllama setup --provider openai
# Will prompt for your OpenAI API key

For OpenAI-Compatible APIs

mdllama setup --provider openai --openai-api-base https://ai.hackclub.com
# Then provide your API key when prompted

Usage Examples

# Use with OpenAI
mdllama chat --provider openai "Explain quantum computing"

# Use with specific model and provider
mdllama run --provider openai --model gpt-4

# Interactive session with streaming
mdllama run --provider openai --stream=true --render-markdown

Live Demo

Configure the demo endpoint used by mdllama by running the setup flow and entering the API credentials below when prompted.

mdllama setup -p openai --openai-api-base https://ai.qincai.xyz

When prompted, provide the following values:

  • API key: sk-proxy-7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e

After setup you can run the CLI as usual, for example:

mdllama run -p openai

[!NOTE] Try asking the model to give you some markdown-formatted text, or test the web search features:

  • Give me a markdown-formatted text about the history of AI.
  • search:Python 3.13 (web search)
  • site:python.org (fetch website content)
  • websearch:What are the latest Python features? (AI-powered search)

So, try it out and see how it works!

Installation

Install using a package manager (recommended, supported method)

Debian/Ubuntu Installation

  1. Add the PPA to your sources list:

    echo 'deb [trusted=yes] https://packages.qincai.xyz/debian stable main' | sudo tee /etc/apt/sources.list.d/qincai-ppa.list
    sudo apt update
    
  2. Install mdllama:

    sudo apt install python3-mdllama
    

Fedora Installation

  1. Download the latest RPM from: https://packages.qincai.xyz/fedora/

    Or, to install directly:

    sudo dnf install https://packages.qincai.xyz/fedora/mdllama-<version>.noarch.rpm
    

    Replace <version> with the latest version number.

  2. (Optional, highly recommended) To enable as a repository for updates, create /etc/yum.repos.d/qincai-ppa.repo:

    [qincai-ppa]
    name=Raymont's Personal RPMs
    baseurl=https://packages.qincai.xyz/fedora/
    enabled=1
    metadata_expire=0
    gpgcheck=0
    

    Then install with:

    sudo dnf install mdllama
    

  3. Install the ollama library with pip:

pip install ollama

You can also install it globally with:

sudo pip install ollama

[!NOTE] Fedora has no system python3-ollama package, so earlier RPM builds did not bundle the ollama library and it had to be installed manually with pip. Current RPM packages include a post-installation script that installs the ollama library via pip automatically; step 3 above is only needed if that script fails or you are on an older package.
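To confirm that the ollama library is importable after installation, you can use a quick check like the one below (a generic sketch using the standard library; any module name works):

```python
import importlib.util

def is_installed(module_name: str) -> bool:
    """Return True if the named module can be found by the import system."""
    return importlib.util.find_spec(module_name) is not None

# Should print True once `pip install ollama` has completed.
print(is_installed("ollama"))
```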


PyPI Installation (Cross-Platform)

Install via pip (recommended for Windows/macOS and Python virtual environments):

pip install mdllama

Traditional Bash Script Installation (Linux)

[!WARNING] This method of installation and removal is deprecated and should be avoided. Please use the pip method, or use the DEB/RPM packages instead.

To install mdllama using the traditional bash script, run:

bash <(curl -fsSL https://raw.githubusercontent.com/QinCai-rui/mdllama/refs/heads/main/install.sh)

To uninstall mdllama, run:

bash <(curl -fsSL https://raw.githubusercontent.com/QinCai-rui/mdllama/refs/heads/main/uninstall.sh)

License

This project is licensed under the GNU General Public License v3.0. See the LICENSE file for details.




Download files

Download the file for your platform.

Source Distribution

mdllama-4.2.5.tar.gz (47.5 kB)


Built Distribution


mdllama-4.2.5-py3-none-any.whl (51.2 kB)


File details

Details for the file mdllama-4.2.5.tar.gz.

File metadata

  • Download URL: mdllama-4.2.5.tar.gz
  • Upload date:
  • Size: 47.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mdllama-4.2.5.tar.gz:

  • SHA256: bbb08bf68de8f9dba3cab0cb274845638881141f71837437a14373a1674635c8
  • MD5: ddc53e8463f014599029e77cdaf94993
  • BLAKE2b-256: d5de08389ced5bbd9fbc341e7f4333354cdc0e5878e3ea2a1d3976a4c0e8a519
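On Linux (or macOS with coreutils), a downloaded file can be checked against its published SHA-256 with sha256sum. A minimal sketch, assuming the sdist sits in the current directory and using the hash listed above:

```shell
# sha256sum -c reads "hash  filename" pairs from stdin and reports OK/FAILED.
echo "bbb08bf68de8f9dba3cab0cb274845638881141f71837437a14373a1674635c8  mdllama-4.2.5.tar.gz" | sha256sum -c -
```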


Provenance

The following attestation bundles were made for mdllama-4.2.5.tar.gz:

Publisher: publish-to-pypi.yml on QinCai-rui/mdllama

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mdllama-4.2.5-py3-none-any.whl.

File metadata

  • Download URL: mdllama-4.2.5-py3-none-any.whl
  • Upload date:
  • Size: 51.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mdllama-4.2.5-py3-none-any.whl:

  • SHA256: f7d651f9f8680075a0e90d1e3fbe9980eac56d02ae64e78e67288be62f819900
  • MD5: 0bcac84a54ef8b457f07359d65bab277
  • BLAKE2b-256: 0903e2a81a66becbfeb8b7e613db0cd524f70acbfc9eb99de43ae9b994cd94ed
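pip can also enforce these hashes at install time through its hash-checking mode. A sketch of a requirements file that pins mdllama to this release, using the SHA-256 values published above (pip accepts multiple --hash options per requirement, one per file format):

```
# requirements.txt -- pin mdllama and require a matching hash
mdllama==4.2.5 \
    --hash=sha256:f7d651f9f8680075a0e90d1e3fbe9980eac56d02ae64e78e67288be62f819900 \
    --hash=sha256:bbb08bf68de8f9dba3cab0cb274845638881141f71837437a14373a1674635c8
```

Install with `pip install --require-hashes -r requirements.txt`; pip then refuses any file whose hash does not match.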


Provenance

The following attestation bundles were made for mdllama-4.2.5-py3-none-any.whl:

Publisher: publish-to-pypi.yml on QinCai-rui/mdllama

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
