Model manager compatible with Ollama models through Ollama API for Quackamollie Telegram chat bot

Project description

Name:

Quackamollie Ollama Model Manager

Package name:

quackamollie-ollama-model-manager

Description:

Model manager compatible with Ollama models through Ollama API for Quackamollie Telegram chat bot

Version:

0.1

Main page:

https://gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_ollama_model_manager

PyPI package:

https://pypi.org/project/quackamollie-ollama-model-manager/

Docker Image:

registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_ollama_model_manager:0.1

Documentation:

https://ollama-model-manager-forge-of-absurd-ducks-quack-4019535ed9bff7.gitlab.io/

Build Status:

Master: [pipeline status badge] [coverage status badge]

Dev: [pipeline status badge] [coverage status badge]



Quackamollie is a Telegram chat bot developed in Python with the aiogram library to serve LLM models running locally through Ollama.

This package is a model manager exposing Ollama models for the Quackamollie project. It contains:

  • the OllamaQuackamollieModelManager class, a model manager implementing the abstract functions of MetaQuackamollieModelManager
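As a hedged sketch of what such an implementation looks like: the method names and the base class body below are assumptions made for illustration, with a minimal stand-in replacing the real MetaQuackamollieModelManager.

```python
# Minimal sketch, NOT the real API: a stand-in abstract base class and a
# subclass implementing its abstract methods, mirroring how
# OllamaQuackamollieModelManager implements MetaQuackamollieModelManager.
from abc import ABC, abstractmethod


class MetaQuackamollieModelManager(ABC):
    """Stand-in for the real Quackamollie base class (assumed interface)."""

    @abstractmethod
    def get_model_list(self) -> list[str]:
        """Return the names of the models this manager can serve."""

    @abstractmethod
    def load_model(self, model_name: str) -> str:
        """Prepare a model for answering requests."""


class OllamaQuackamollieModelManager(MetaQuackamollieModelManager):
    """Illustrative subclass; the real one talks to the Ollama API."""

    def get_model_list(self) -> list[str]:
        # The real implementation would list models known to the local
        # Ollama server; a static list keeps this sketch self-contained.
        return ["llama3"]

    def load_model(self, model_name: str) -> str:
        return f"loaded {model_name}"


manager = OllamaQuackamollieModelManager()
print(manager.get_model_list())  # ['llama3']
```

Because the base class declares its methods abstract, forgetting to implement one of them in a subclass raises a TypeError at instantiation time rather than failing later at call time.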

Learn more about Quackamollie on the project main page.

Requirements

Virtual environment

  • Set up a virtual environment with Python 3.10

make venv
# or
python3 -m venv venv
  • Activate the environment

source venv/bin/activate
  • If you want to deactivate the environment

deactivate

Tests

Tests requirements

  • Install test requirements

make devtools
# or
pip install tox

Run pytest

  • Run the tests

tox

Run lint

  • Run the linter

tox -e lint

Documentation

  • To auto-generate the documentation configuration

tox -e gendocs
  • To generate the documentation in Html

tox -e docs
  • An automatically generated version of this project's documentation can be found at the documentation link listed above

    • N.B.: This automatically generated documentation is still lacking a lot of improvements. Sorry for the inconvenience.

Install

  • Install the application from sources

make install
# or
pip install .
  • Or install it from distribution

pip install dist/quackamollie_ollama_model_manager-0.1.tar.gz
  • Or install it from wheel

pip install dist/quackamollie_ollama_model_manager-0.1-py3-none-any.whl
  • Or install it from the PyPI repository

pip install quackamollie-ollama-model-manager  # latest
# or
pip install "quackamollie-ollama-model-manager==0.1"

Docker

  • To build the application's Docker image

docker build --network=host -t quackamollie_ollama_model_manager:0.1 .
  • The official Docker image of this project is available at: registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_ollama_model_manager

  • You can pull the image of the current release:

docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_ollama_model_manager:latest  # or dev
# or
docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_ollama_model_manager:0.1

Running the model manager

The quackamollie-ollama-model-manager package is automatically discovered, through entry points, by the command line tool named quackamollie. Therefore, once installed, you should automatically see the models managed by this model manager in the Telegram /settings command.

You can install models for this model manager by simply pulling them using the ollama command:

ollama pull llama3

For details on how to run the Quackamollie project, please refer to the Quackamollie project's main page.

Authors

Contributing

If you want to report a bug or ask for a new feature of quackamollie-ollama-model-manager, please open an issue in the Gitlab ticket management section of this project. Please, first ensure that your issue is not redundant with already open issues.

If you want to contribute code to this project, please first open an issue and then a merge request with commit names referencing this issue. Note that only fast-forward merge requests are accepted.

For more details on the general contributing mindset of this project, please refer to CONTRIBUTING.md.

Credits

Section in writing, sorry for the inconvenience.

Project details


Download files

Source Distribution

quackamollie_ollama_model_manager-0.1.tar.gz (19.3 kB)

Built Distribution

quackamollie_ollama_model_manager-0.1-py3-none-any.whl

File details

Details for the file quackamollie_ollama_model_manager-0.1.tar.gz.

File metadata

File hashes

Hashes for quackamollie_ollama_model_manager-0.1.tar.gz
Algorithm Hash digest
SHA256 44b37cdbd90e535f6a59a5ddee4191e31bf7b8b5c21889b27cd28a30f1ab0699
MD5 1ac31d352e474cbff637ddfc41fe415f
BLAKE2b-256 31ed330c630baf92a2217b0ff8db9aaf0a7e22255eb13f544fa21c13509c39d6


File details

Details for the file quackamollie_ollama_model_manager-0.1-py3-none-any.whl.

File metadata

File hashes

Hashes for quackamollie_ollama_model_manager-0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 c75e63bd8308d23b5742c27a3535419f5dd7c63dfb6f9810578df40f058a75da
MD5 6bdb4c2a45567f8ad18bba5dd6466555
BLAKE2b-256 6e4fc020b296bc762152813629b25d9755ab27a100dd4207c2fb3d12b2e54a52

