
A command-line interface for the Ollama API

Project description

mdllama


A CLI tool that lets you chat with Ollama models right from your terminal, with built-in Markdown rendering.

mdllama makes it easy to interact with Ollama's AI models directly from your command line, while rendering responses as Markdown in real time.
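
For example, after installing (see Installation below) you can list the available commands and options; the exact subcommands vary between releases, so the built-in help is the authoritative reference:

    # Show the available commands and options (assumes the executable is named mdllama)
    mdllama --help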

Features

  • Chat with Ollama models from the terminal
  • Built-in Markdown rendering
  • Simple installation and removal (see below)

Screenshots

Chat Interface

(screenshot)

Help

(screenshot)

Demo

Note: If the video does not play, you can download it here.

Installation

Install using package manager (recommended)

Debian/Ubuntu Installation

  1. Add the PPA to your sources list:

    echo 'deb [trusted=yes] https://packages.qincai.xyz/debian stable main' | sudo tee /etc/apt/sources.list.d/qincai-mdllama.list
    sudo apt update
    
  2. Install mdllama:

    sudo apt install python3-mdllama
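
After installation you can optionally verify where the package came from and that the executable is on your PATH (a quick sanity check, not part of the official steps):

    apt policy python3-mdllama   # installed version and the packages.qincai.xyz origin
    command -v mdllama           # path of the installed executable (name assumed from the project)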
    

Fedora Installation

  1. Download the latest RPM from: https://packages.qincai.xyz/fedora/

    Or, to install directly:

    sudo dnf install https://packages.qincai.xyz/fedora/mdllama-<version>.noarch.rpm
    

    Replace <version> with the latest version number.

  2. (Optional, but highly recommended) To enable it as a repository for future updates, create /etc/yum.repos.d/qincai-mdllama.repo:

    [qincai-mdllama]
    name=Raymont's Personal RPMs
    baseurl=https://packages.qincai.xyz/fedora/
    enabled=1
    metadata_expire=0
    gpgcheck=0
    

    Then install with:

    sudo dnf install mdllama
    

  3. Install the ollama library from pip:

    pip install ollama

    You can also install it globally with:

    sudo pip install ollama

    Note: The ollama library is not installed by the RPM package, since there is no system ollama package (python3-ollama) available. You need to install it manually using pip in order to use mdllama with Ollama models.
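
As an optional sanity check (a sketch, not part of the packaged instructions), you can confirm that dnf sees the package, that the ollama library imports, and that a local Ollama server is reachable:

    dnf info mdllama                                # shows the installed package and its repository
    python3 -c "import ollama; print('ollama library OK')"
    curl -s http://localhost:11434/api/version      # default Ollama port; adjust if your server differs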


Traditional Bash Script Installation (Linux)

To install mdllama using the traditional bash script, run:

bash <(curl -fsSL https://raw.githubusercontent.com/QinCai-rui/mdllama/refs/heads/main/install.sh)

To uninstall mdllama, run:

bash <(curl -fsSL https://raw.githubusercontent.com/QinCai-rui/mdllama/refs/heads/main/uninstall.sh)
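
If you prefer to review the script before running it (a good habit with curl-to-bash installers), download it first and inspect it:

    curl -fsSL https://raw.githubusercontent.com/QinCai-rui/mdllama/refs/heads/main/install.sh -o mdllama-install.sh
    less mdllama-install.sh    # read through the script
    bash mdllama-install.sh    # run it once you are satisfied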

Windows & macOS Installation

Install via pip (recommended for Windows/macOS):

pip install mdllama
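
To keep mdllama isolated from other Python packages, you may prefer a virtual environment (or pipx, if you have it installed); a minimal sketch:

    python -m venv mdllama-env
    source mdllama-env/bin/activate   # on Windows: mdllama-env\Scripts\activate
    pip install mdllama               # add `ollama` too if it is not pulled in automatically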

License

This project is licensed under the GNU General Public License v3.0. See the LICENSE file for details.




Download files

Download the file for your platform.

Source Distribution

mdllama-2.2.4.tar.gz (54.0 kB)

Uploaded Source

Built Distribution


mdllama-2.2.4-py3-none-any.whl (41.7 kB)

Uploaded Python 3

File details

Details for the file mdllama-2.2.4.tar.gz.

File metadata

  • Download URL: mdllama-2.2.4.tar.gz
  • Upload date:
  • Size: 54.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mdllama-2.2.4.tar.gz
Algorithm Hash digest
SHA256 4cc9a3d4e09b698c2dad3e1e3bc0edf7b581376ed05fef272cb4f34ddac8e9b4
MD5 c183b7263c6e5548c031f8ce490b7159
BLAKE2b-256 1b32c78e3f53c2a92eda90c7431b4bc724137910928a0a86e6998216c6b8423e

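If you download the sdist manually, you can check it against the SHA256 digest listed above before installing (a quick sketch using standard tools):

    # Linux
    sha256sum mdllama-2.2.4.tar.gz
    # macOS
    shasum -a 256 mdllama-2.2.4.tar.gz
    # either should print:
    # 4cc9a3d4e09b698c2dad3e1e3bc0edf7b581376ed05fef272cb4f34ddac8e9b4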

Provenance

The following attestation bundles were made for mdllama-2.2.4.tar.gz:

Publisher: publish-to-pypi.yml on QinCai-rui/mdllama

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mdllama-2.2.4-py3-none-any.whl.

File metadata

  • Download URL: mdllama-2.2.4-py3-none-any.whl
  • Upload date:
  • Size: 41.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mdllama-2.2.4-py3-none-any.whl
Algorithm Hash digest
SHA256 c8a0e7413040a39a4b453f22067aafefbe6967011b40a83b3469d14feb65ebf3
MD5 1541cb7496d1d51ababad49394c98424
BLAKE2b-256 4da8645b5b63821e957b94ba05c8a1e2ec0b706f296cf0e019c1e3a2c49af8b8


Provenance

The following attestation bundles were made for mdllama-2.2.4-py3-none-any.whl:

Publisher: publish-to-pypi.yml on QinCai-rui/mdllama

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
