Matrix chatbot powered by Ollama

Project description

ollamarama-matrix

License: AGPL v3 · Python 3.8+ · Matrix Protocol · Ollama · GitHub

Ollamarama is a powerful AI chatbot for the Matrix chat protocol powered by the Ollama Chat API. Transform your Matrix rooms with an AI that can roleplay as virtually anything you can imagine — privately and locally.

Features

  • Dynamic personalities with quick switching
  • Per‑user history, isolated per room and user
  • Collaborative mode to talk across histories
  • Admin controls for model switching and resets
  • Custom system prompts for specialized tasks
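
The per‑user isolation above can be pictured as a history map keyed by (room, user). A minimal sketch of that idea, illustrative only and not the project's actual data model:

```python
# Illustrative sketch: per-room, per-user chat histories.
# The real bot's internals may differ; this only shows the isolation idea.
from collections import defaultdict

histories: "dict[tuple[str, str], list[dict]]" = defaultdict(list)

def add_message(room: str, user: str, role: str, content: str) -> None:
    """Append a message to one user's history in one room."""
    histories[(room, user)].append({"role": role, "content": content})

add_message("!room:example.org", "@alice:example.org", "user", "hello")
add_message("!room:example.org", "@bob:example.org", "user", "hi")

# Alice's and Bob's histories stay separate even in the same room.
assert len(histories[("!room:example.org", "@alice:example.org")]) == 1
assert len(histories[("!room:example.org", "@bob:example.org")]) == 1
```

Collaborative mode (.x) then amounts to reading from another user's key instead of your own.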

Installation

Installation options, depending on how you prefer to run it:

  • From source (installs CLI):
    • Clone this repo, then run: pip install .
    • Or use pipx for isolation: pipx install .
  • From source without installing the package:
    • pip install -r requirements.txt
    • Run with: python -m ollamarama --config config.json

After installation, use the ollamarama-matrix command. For E2E encryption, ensure libolm is installed; see Operations & E2E.

Quick Start

Prerequisites

Install and familiarize yourself with Ollama to run local LLMs.

curl -fsSL https://ollama.com/install.sh | sh

Pull at least one model (recommended):

ollama pull qwen3

For deeper setup and verification steps, see Ollama Setup.
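
Before starting the bot, it can help to verify the Ollama server is reachable. One way is to query Ollama's /api/tags endpoint, which lists locally pulled models; this sketch (an assumption about your setup, with the default port 11434) returns the model names or None if the server is down:

```python
# Check that an Ollama server is reachable and list its local models.
# Assumes the default Ollama address http://localhost:11434.
from __future__ import annotations
import json
from urllib.request import urlopen
from urllib.error import URLError

def list_models(base_url: str = "http://localhost:11434") -> list[str] | None:
    """Return the names of locally pulled models, or None if unreachable."""
    try:
        with urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
    except (URLError, OSError):
        return None
    return [m["name"] for m in data.get("models", [])]

# Prints the model list, or None if no server is running.
print(list_models())
```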

1) Install dependencies

pip install -r requirements.txt

2) Configure

Create or edit config.json at the repo root. See Configuration for a minimal example, full schema, and validation guidance.
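
As a starting point, a config file like the following could be generated; note that every key name here (the matrix and ollama sections and their fields) is an assumption for illustration only, so check the Configuration docs for the authoritative schema:

```python
# Hypothetical minimal config.json sketch. Key names are assumptions
# for illustration; consult the project's Configuration docs for the
# real schema before using this.
import json

config = {
    "matrix": {
        "server": "https://matrix.org",
        "username": "@mybot:matrix.org",
        "password": "CHANGE_ME",
        "channels": ["#myroom:matrix.org"],
        "admins": ["@me:matrix.org"],
    },
    "ollama": {
        "api_url": "http://localhost:11434",
        "models": {"qwen3": "qwen3:latest"},
        "default_model": "qwen3",
    },
}

with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```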

3) Run

Preferred (installed command):

ollamarama-matrix --config config.json

Fetch models from the server (ignores ollama.models in config):

ollamarama-matrix --config config.json --server-models

Short form:

ollamarama-matrix -S --config config.json

Alternatively, run as a module:

python -m ollamarama --config config.json

4) Try It

  • The bot logs in and joins configured rooms
  • Send .ai hello or BotName: hello in a joined room
  • The bot replies and maintains per‑user history

Usage Guide

Common commands (see Commands for the full list):

Command | Description | Example
.ai <message> or botname: <message> | Chat with the AI | .ai Hello there!
.x <user> <message> | Continue another user's conversation | .x Alice What did we discuss?
.persona <text> | Change your personality | .persona helpful librarian
.custom <prompt> | Use a custom system prompt | .custom You are a coding expert
.reset / .stock | Clear history (default/stock prompt) | .reset
.model [name] (admin) | Show or change the model | .model qwen3
.clear (admin) | Reset globally for all users | .clear
.help | Show inline help | .help
.verbose [on,off,toggle] (admin) | Control inclusion of the brevity clause for new conversations | .verbose on
.thinking [on,off,toggle] (admin) | Show or hide the thinking placeholder while generating | .thinking off
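
Dispatch for dot-prefixed commands like these can be sketched as a simple prefix parser; this is illustrative only, not the bot's actual parsing logic:

```python
# Illustrative parser for messages like ".ai hello" or ".model qwen3".
# Not the project's implementation; just shows prefix dispatch.
from __future__ import annotations

COMMANDS = {".ai", ".x", ".persona", ".custom", ".reset", ".stock",
            ".model", ".clear", ".help", ".verbose", ".thinking"}

def parse_command(message: str) -> tuple[str, str] | None:
    """Split a message into (command, arguments), or None if not a command."""
    if not message.startswith("."):
        return None
    head, _, rest = message.partition(" ")
    return (head, rest.strip()) if head in COMMANDS else None

assert parse_command(".ai Hello there!") == (".ai", "Hello there!")
assert parse_command(".x Alice What did we discuss?") == (".x", "Alice What did we discuss?")
assert parse_command("just chatting") is None
```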

Encryption Support

  • Works in encrypted Matrix rooms using matrix-nio[e2e] with device verification.
  • E2E requires libolm to be available to Python. If it is unavailable, you can run without E2E; see Getting Started (Install Dependencies).
  • Persist the store/ directory to retain device keys and encryption state.
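
matrix-nio's E2E support is provided by the python-olm bindings (imported as the `olm` module), so one quick way to check whether encryption support is present is to test for that module; a small sketch:

```python
# Check whether the olm bindings (needed by matrix-nio[e2e]) are importable.
# If this prints False, the bot can still run without end-to-end encryption.
import importlib.util

def e2e_available() -> bool:
    """Return True if the python-olm 'olm' module can be imported."""
    return importlib.util.find_spec("olm") is not None

print("E2E available:", e2e_available())
```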

License

AGPL‑3.0 — see LICENSE for details.

Project details


Download files

Download the file for your platform.

Source Distribution

ollamarama_matrix-1.3.1.tar.gz (82.7 kB)

Uploaded Source

Built Distribution

ollamarama_matrix-1.3.1-py3-none-any.whl (73.5 kB)

Uploaded Python 3

File details

Details for the file ollamarama_matrix-1.3.1.tar.gz.

File metadata

  • Download URL: ollamarama_matrix-1.3.1.tar.gz
  • Upload date:
  • Size: 82.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ollamarama_matrix-1.3.1.tar.gz
Algorithm | Hash digest
SHA256 | 04f3c7a02f3f75250f941e290a02e6405fcc51fb81b30e803cb9b5ee65237db4
MD5 | aca8f751699648270cba319f94d289e1
BLAKE2b-256 | 64f984a89d969e3ad4f1cc76ffdf5000e129db17e93d916e6ab7f3be050a7f56


Provenance

The following attestation bundles were made for ollamarama_matrix-1.3.1.tar.gz:

Publisher: pypi-publish.yml on h1ddenpr0cess20/ollamarama-matrix

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ollamarama_matrix-1.3.1-py3-none-any.whl.

File hashes

Hashes for ollamarama_matrix-1.3.1-py3-none-any.whl
Algorithm | Hash digest
SHA256 | 7efac70bc31cde79aef8383723a28fb76542debe599198acc281c4ef91b13bdb
MD5 | 3c524ccc91a402d947cb9629d85d62c7
BLAKE2b-256 | de68e231b0de60d6038d67e7aa9b461c741980d51c739576f86d6889a52b5066


Provenance

The following attestation bundles were made for ollamarama_matrix-1.3.1-py3-none-any.whl:

Publisher: pypi-publish.yml on h1ddenpr0cess20/ollamarama-matrix

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
