
Model manager compatible with Llama-Index models for Quackamollie Telegram chat bot


Name: Quackamollie Llama Index Model Manager

Package name: quackamollie-llama-index-model-manager

Description: Model manager compatible with Llama-Index models for Quackamollie Telegram chat bot

Version: 0.1

Main page: https://gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_llama_index_model_manager

PyPI package: https://pypi.org/project/quackamollie-llama-index-model-manager/

Docker Image: registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_llama_index_model_manager:0.1

Documentation: https://llama-index-model-manager-forge-of-absurd-ducks–727628676f6413.gitlab.io/

Build Status: pipeline and coverage badges for the master and dev branches are shown on the project main page


Project description

Quackamollie is a Telegram chat bot developed in Python with the aiogram library to serve LLM models running locally through Ollama.

This package is a model manager exposing Llama-Index models for the Quackamollie project. It contains:

  • a LlamaIndexQuackamollieModelManager model manager class implementing the abstract methods of MetaQuackamollieModelManager (see the illustrative sketch below)

Learn more about Quackamollie on the project main page.
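
To give a feel for the subclass-and-implement pattern described above, here is a rough, illustrative sketch only. It does not use the real quackamollie-core API: the base class, method names and signatures below are placeholders standing in for MetaQuackamollieModelManager and its abstract interface, which are defined in quackamollie-core, and the subclass is not the shipped LlamaIndexQuackamollieModelManager.

# Illustrative sketch only: the real plugin subclasses MetaQuackamollieModelManager
# from quackamollie-core; the base class and method names here are placeholders.
from abc import ABC, abstractmethod
from typing import AsyncIterator, List


class PlaceholderModelManagerBase(ABC):
    """Stand-in for the abstract model manager interface of quackamollie-core."""

    @abstractmethod
    def get_model_names(self) -> List[str]:
        """List the model names this manager can serve (placeholder method name)."""

    @abstractmethod
    def answer(self, model_name: str, prompt: str) -> AsyncIterator[str]:
        """Stream an answer for `prompt` using `model_name` (placeholder method name)."""


class SketchLlamaIndexModelManager(PlaceholderModelManagerBase):
    """Rough sketch of a Llama-Index-backed manager, not the shipped class."""

    def get_model_names(self) -> List[str]:
        # A real implementation would ask Ollama which models are pulled locally.
        return ["llama3"]

    async def answer(self, model_name: str, prompt: str) -> AsyncIterator[str]:
        # A real implementation would build a Llama-Index chat engine backed by the
        # Ollama LLM and stream the generated tokens back to the Telegram chat.
        yield f"[{model_name}] echo: {prompt}"

In the real package, the quackamollie core discovers such a class through its entry point and drives it from the Telegram bot; the sketch above only illustrates the general shape of a model manager plugin.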

Requirements

Virtual environment

  • Set up a virtual environment with Python 3.10

make venv
# or
python3 -m venv venv
  • Activate the environment

source venv/bin/activate
  • If you want to deactivate the environment

deactivate

Tests

Test requirements

  • Install test requirements

make devtools
# or
pip install tox

Run pytest

  • Run the tests

tox

Run lint

  • Run the linter

tox -e lint

Documentation

  • To auto-generate the documentation configuration

tox -e gendocs
  • To generate the documentation in HTML

tox -e docs
  • An automatically generated version of this project's documentation is available at the Documentation link listed above

    • N.B.: This automatically generated documentation is still lacking a lot of improvements. Sorry for the inconvenience.

Install

  • Install the application from source

make install
# or
pip install .
  • Or install it from the source distribution

pip install dist/quackamollie_llama_index_model_manager-0.1.tar.gz
  • Or install it from the wheel

pip install dist/quackamollie_llama_index_model_manager-0.1-py3-none-any.whl
  • Or install it from the PyPI repository

pip install quackamollie-llama-index-model-manager  # latest
# or
pip install "quackamollie-llama-index-model-manager==0.1"

Docker

  • To build the application Docker image

docker build --network=host -t quackamollie_llama_index_model_manager:0.1 .
  • The official Docker image of this project is available at: registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_llama_index_model_manager

  • You can pull the image of the current release:

docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_llama_index_model_manager:latest  # or dev
# or
docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/model_managers/quackamollie_llama_index_model_manager:0.1

Running the model manager

The quackamollie-llama-index-model-manager package is automatically discovered, through entry points, by the command line tool named quackamollie. Therefore, once it is installed, you should automatically see the models managed by this model manager in the Telegram /settings command.
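
As a rough illustration of how entry-point discovery works in general, a plugin host can enumerate and load registered classes with importlib.metadata. Note that the group name used below is a placeholder for this sketch, not necessarily the group quackamollie actually registers.

# Conceptual illustration of entry-point discovery (Python 3.10+).
# "quackamollie.model_manager" is a placeholder group name for this sketch.
from importlib.metadata import entry_points

for ep in entry_points(group="quackamollie.model_manager"):
    print(ep.name, "->", ep.value)   # the dotted path of the registered plugin class
    # manager_cls = ep.load()        # loading imports the model manager class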

You can install models for this model manager by simply pulling them using the ollama command:

ollama pull llama3
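
Alternatively, if the ollama Python client is installed and the Ollama server is running, the same pull and a quick listing of the locally available models can be done from Python. This is purely optional and shown only as a convenience.

# Optional alternative to the CLI: requires `pip install ollama` and a running Ollama server.
import ollama

ollama.pull("llama3")   # download the model if it is not already present locally
print(ollama.list())    # show the models Ollama can currently serve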

For details on how to run the Quackamollie project, please refer to the Quackamollie project's main page.

Authors

Contributing

If you want to report a bug or ask for a new feature of quackamollie-llama-index-model-manager, please open an issue in the GitLab ticket management section of this project. Please first ensure that your issue is not redundant with already open issues.

If you want to contribute code to this project, please first open an issue and then a merge request with commit messages referencing this issue. Note that only fast-forward merge requests are accepted.

For more details on the general contributing mindset of this project, please refer to CONTRIBUTING.md.

Credits

This section is still being written; sorry for the inconvenience.




File details

Details for the file quackamollie_llama_index_model_manager-0.1.tar.gz.


File hashes

Hashes for quackamollie_llama_index_model_manager-0.1.tar.gz:

SHA256: 5a5175bfe1f3b9fc2e7efdfcc55b745255c6f5d6e355638b2be5cecbb39cf981
MD5: 3f306b8cf70839bfa12e211b7e002311
BLAKE2b-256: 22f7f85dbdec4a1ee4e5420173c2485d640f969229714c6e646183073fad1a1b



File details

Details for the file quackamollie_llama_index_model_manager-0.1-py3-none-any.whl.


File hashes

Hashes for quackamollie_llama_index_model_manager-0.1-py3-none-any.whl:

SHA256: d62c2c4827ebcf1afcbe242a0b804c5a20e9c54bda53daebd271046a3d595d25
MD5: 296807cb8fc61a3bb991c4c851c9e78f
BLAKE2b-256: d87e086ec90587a501f193aee9ca0636f195a93151bd70c85f82771956059f4e


