# ollamarama-matrix

Matrix chatbot powered by Ollama.
Ollamarama is a powerful AI chatbot for the Matrix chat protocol powered by the Ollama Chat API. Transform your Matrix rooms with an AI that can roleplay as virtually anything you can imagine — privately and locally.
## Documentation
- Overview
- Getting Started
- Ollama Setup
- Configuration
- Commands
- Tools & MCP
- Docker
- CLI Reference
- Operations & E2E
- Architecture
- Development
- Migration
- Legacy Map
- Security
- AI Output Disclaimer
## Features
- Dynamic personalities with quick switching
- Per‑user history, isolated per room and user
- Collaborative mode to talk across histories
- Admin controls for model switching and resets
- Custom system prompts for specialized tasks
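The per‑user history feature above amounts to keeping a separate message list per (room, user) pair. The following is an illustrative sketch of that idea, not ollamarama's actual implementation; the class and method names are invented for this example.

```python
from collections import defaultdict

class HistoryStore:
    """Sketch of per-room, per-user conversation isolation.

    Each (room_id, user_id) pair gets its own message list, seeded with
    the system prompt, so conversations never bleed across users.
    """

    def __init__(self, system_prompt: str):
        self.system_prompt = system_prompt
        self._histories: dict[tuple[str, str], list[dict]] = defaultdict(list)

    def get(self, room_id: str, user_id: str) -> list[dict]:
        key = (room_id, user_id)
        if not self._histories[key]:
            # First message for this pair: seed with the system prompt.
            self._histories[key].append(
                {"role": "system", "content": self.system_prompt}
            )
        return self._histories[key]

    def reset(self, room_id: str, user_id: str) -> None:
        # Drop this pair's history; next get() re-seeds the system prompt.
        self._histories.pop((room_id, user_id), None)

store = HistoryStore("You are a helpful assistant.")
store.get("!room:a", "@alice:a").append({"role": "user", "content": "hi"})
assert len(store.get("!room:a", "@bob:a")) == 1  # Bob only has the system prompt
```

Collaborative mode (`.x`) then simply reads another user's key instead of the sender's.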
## Related Projects
- IRC version: https://github.com/h1ddenpr0cess20/ollamarama-irc
- CLI version: https://github.com/h1ddenpr0cess20/ollamarama
## Installation

Options depending on how you prefer to run it:

- From source (installs the CLI):
  - Clone this repo, then run:

    ```shell
    pip install .
    ```

  - Or use pipx for isolation:

    ```shell
    pipx install .
    ```

- From source without installing the package:
  - Clone this repo, then run:

    ```shell
    pip install -r requirements.txt
    ```

  - Run with:

    ```shell
    python -m ollamarama --config config.json
    ```
After installation, use the `ollamarama-matrix` command. For E2E encryption, ensure `libolm` is installed; see Operations & E2E.
## Quick Start

### Prerequisites

Install and familiarize yourself with Ollama to run local LLMs:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

Pull at least one model (recommended):

```shell
ollama pull qwen3
```

For deeper setup and verification steps, see Ollama Setup.
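Before starting the bot, you can confirm the Ollama server is reachable and see which models it has pulled by querying its `/api/tags` endpoint (part of Ollama's public REST API). This helper is an illustrative sketch, not part of ollamarama:

```python
import json
import urllib.request

def list_local_models(host: str = "http://localhost:11434") -> list[str]:
    """Return the names of models the local Ollama server has pulled.

    Queries Ollama's /api/tags endpoint; raises urllib.error.URLError
    if the server is not running at `host`.
    """
    with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]
```

If `qwen3` does not appear in the returned list, `ollama pull qwen3` has not completed.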
### 1) Install dependencies

```shell
pip install -r requirements.txt
```
### 2) Configure

Create or edit `config.json` at the repo root. See Configuration for a minimal example, the full schema, and validation guidance.
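As a rough orientation, a `config.json` might take a shape like the sketch below. The field names here are invented for illustration and may not match the real schema; the Configuration page is authoritative.

```json
{
  "matrix": {
    "server": "https://matrix.example.org",
    "username": "@mybot:example.org",
    "password": "change-me",
    "channels": ["#myroom:example.org"],
    "admins": ["@me:example.org"]
  },
  "ollama": {
    "api_url": "http://localhost:11434",
    "models": ["qwen3"],
    "default_model": "qwen3"
  }
}
```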
### 3) Run

Preferred (installed command):

```shell
ollamarama-matrix --config config.json
```

Fetch models from the server (ignores `ollama.models` in config):

```shell
ollamarama-matrix --config config.json --server-models
```

Short form:

```shell
ollamarama-matrix -S --config config.json
```

Alternatively, run as a module:

```shell
python -m ollamarama --config config.json
```
### 4) Try It

- The bot logs in and joins configured rooms
- Send `.ai hello` or `BotName: hello` in a joined room
- The bot replies and maintains per‑user history
## Usage Guide

Common commands (see Commands for the full list):

| Command | Description | Example |
|---|---|---|
| `.ai <message>` or `botname: <message>` | Chat with the AI | `.ai Hello there!` |
| `.x <user> <message>` | Continue another user's conversation | `.x Alice What did we discuss?` |
| `.persona <text>` | Change your personality | `.persona helpful librarian` |
| `.custom <prompt>` | Use a custom system prompt | `.custom You are a coding expert` |
| `.reset` / `.stock` | Clear history (default/stock prompt) | `.reset` |
| `.model [name]` (admin) | Show or change the model | `.model qwen3` |
| `.clear` (admin) | Reset globally for all users | `.clear` |
| `.help` | Show inline help | `.help` |
| `.verbose [on,off,toggle]` (admin) | Control inclusion of the brevity clause for new conversations | `.verbose on` |
| `.thinking [on,off,toggle]` (admin) | Show or hide the thinking placeholder while generating | `.thinking off` |
## Encryption Support

- Works in encrypted Matrix rooms using `matrix-nio[e2e]` with device verification.
- Requires `libolm` available to Python for E2E. If unavailable, you can run without E2E; see Getting Started (Install Dependencies).
- Persist the `store/` directory to retain device keys and encryption state.
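A quick way to check whether E2E support is available in your environment is to test whether the `olm` Python package (installed by the `matrix-nio[e2e]` extra, which needs the native `libolm` library) can be imported. This helper is an illustrative sketch, not part of ollamarama:

```python
def e2e_available() -> bool:
    """Report whether python-olm (and thus libolm) is importable."""
    try:
        import olm  # noqa: F401  # provided by matrix-nio[e2e]
    except ImportError:
        return False
    return True

print("E2E support:", "available" if e2e_available() else "missing (install libolm)")
```

If this prints "missing", install your distribution's libolm development package before reinstalling `matrix-nio[e2e]`.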
## Community & Policies

### License

AGPL‑3.0 — see LICENSE for details.
## Project details
### Download files

Download the file for your platform.

- Source Distribution: `ollamarama_matrix-1.3.1.tar.gz`
- Built Distribution: `ollamarama_matrix-1.3.1-py3-none-any.whl`
### File details: ollamarama_matrix-1.3.1.tar.gz

- Download URL: ollamarama_matrix-1.3.1.tar.gz
- Upload date:
- Size: 82.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `04f3c7a02f3f75250f941e290a02e6405fcc51fb81b30e803cb9b5ee65237db4` |
| MD5 | `aca8f751699648270cba319f94d289e1` |
| BLAKE2b-256 | `64f984a89d969e3ad4f1cc76ffdf5000e129db17e93d916e6ab7f3be050a7f56` |
### Provenance

The following attestation bundle was made for ollamarama_matrix-1.3.1.tar.gz:

Publisher: pypi-publish.yml on h1ddenpr0cess20/ollamarama-matrix

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ollamarama_matrix-1.3.1.tar.gz
- Subject digest: 04f3c7a02f3f75250f941e290a02e6405fcc51fb81b30e803cb9b5ee65237db4
- Sigstore transparency entry: 1422178056
- Sigstore integration time:
- Permalink: h1ddenpr0cess20/ollamarama-matrix@bd1c66f1bd785d24348297aa3afc949dfab80bbd
- Branch / Tag: refs/tags/v1.3.1
- Owner: https://github.com/h1ddenpr0cess20
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-publish.yml@bd1c66f1bd785d24348297aa3afc949dfab80bbd
- Trigger Event: release
### File details: ollamarama_matrix-1.3.1-py3-none-any.whl

- Download URL: ollamarama_matrix-1.3.1-py3-none-any.whl
- Upload date:
- Size: 73.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7efac70bc31cde79aef8383723a28fb76542debe599198acc281c4ef91b13bdb` |
| MD5 | `3c524ccc91a402d947cb9629d85d62c7` |
| BLAKE2b-256 | `de68e231b0de60d6038d67e7aa9b461c741980d51c739576f86d6889a52b5066` |

### Provenance

The following attestation bundle was made for ollamarama_matrix-1.3.1-py3-none-any.whl:

Publisher: pypi-publish.yml on h1ddenpr0cess20/ollamarama-matrix

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ollamarama_matrix-1.3.1-py3-none-any.whl
- Subject digest: 7efac70bc31cde79aef8383723a28fb76542debe599198acc281c4ef91b13bdb
- Sigstore transparency entry: 1422178156
- Sigstore integration time:
- Permalink: h1ddenpr0cess20/ollamarama-matrix@bd1c66f1bd785d24348297aa3afc949dfab80bbd
- Branch / Tag: refs/tags/v1.3.1
- Owner: https://github.com/h1ddenpr0cess20
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-publish.yml@bd1c66f1bd785d24348297aa3afc949dfab80bbd
- Trigger Event: release