Simple Llama-Index model based on Ollama using Llama-Index model manager for Quackamollie Telegram chat bot

Project description

Name:

Quackamollie Model Llama-Index Simple

Package name:

quackamollie-model-llama-index-simple

Description:

Simple Llama-Index model based on Ollama using Llama-Index model manager for Quackamollie Telegram chat bot

Version:

0.1

Main page:

https://gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple

PyPI package:

https://pypi.org/project/quackamollie-model-llama-index-simple/

Docker Image:

registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple:0.1

Documentation:

https://simple-llama-index-model-forge-of-absurd-ducks-q-8cc6f6ddf9ba3f.gitlab.io



Quackamollie is a Telegram chat bot developed in Python using the aiogram library to serve LLM models running locally through Ollama.

This package is a simple Llama-Index model compatible with the Llama-Index model manager of the Quackamollie project. It contains:

  • a SimpleLlamaIndexQuackamollieModel class implementing the abstract methods of MetaLlamaIndexQuackamollieModel
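As a rough illustration of what implementing such a model class involves, here is a minimal, self-contained sketch. The base class below is a local stand-in for the real MetaLlamaIndexQuackamollieModel, and the method name is an assumption; only the two class names come from this project.

```python
from abc import ABC, abstractmethod


# Local stand-in for the real MetaLlamaIndexQuackamollieModel provided by
# the Quackamollie Llama-Index model manager; its actual abstract methods
# may differ, this is only an illustrative assumption.
class MetaLlamaIndexQuackamollieModel(ABC):
    @abstractmethod
    def stream_answer(self, prompt: str, chat_history: list):
        """Yield answer chunks for a prompt, given the chat history."""


class SimpleLlamaIndexQuackamollieModel(MetaLlamaIndexQuackamollieModel):
    model_name = "simple_llama_index"  # illustrative identifier

    def stream_answer(self, prompt: str, chat_history: list):
        # A real implementation would forward the prompt and history to an
        # Ollama-backed Llama-Index LLM and stream the response chunks.
        yield f"echo: {prompt}"


model = SimpleLlamaIndexQuackamollieModel()
print(next(model.stream_answer("Quack?", [])))
```

The concrete class only has to fill in the abstract methods; the model manager takes care of discovery and routing.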

Learn more about Quackamollie on the project main page.

Requirements

Virtual environment

  • Set up a virtual environment with Python 3.10

make venv
# or
python3 -m venv venv
  • Activate the environment

source venv/bin/activate
  • If you want to deactivate the environment

deactivate

Tests

Tests requirements

  • Install test requirements

make devtools
# or
pip install tox

Run pytest

  • Run the tests

tox

Run lint

  • Run the linter

tox -e lint

Documentation

  • To auto-generate the documentation configuration

tox -e gendocs
  • To generate the documentation in HTML

tox -e docs
  • An automatically generated version of this project documentation can be found here

    • N.B.: This automatically generated documentation is still lacking many improvements. Sorry for the inconvenience.

Install

  • Install the application from sources

make install
# or
pip install .
  • Or install it from distribution

pip install dist/quackamollie-model-llama-index-simple-0.1.tar.gz
  • Or install it from wheel

pip install dist/quackamollie_model_llama_index_simple-0.1-py3-none-any.whl
  • Or install it from the PyPI repository

pip install quackamollie-model-llama-index-simple  # latest
# or
pip install "quackamollie-model-llama-index-simple==0.1"

Docker

  • To build the application Docker image

docker build --network=host -t quackamollie_model_llama_index_simple:0.1 .
  • The official Docker image of this project is available at: registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple

  • You can pull the image of the current release:

docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple:latest  # or dev
# or
docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple:0.1

Running the model

The quackamollie-model-llama-index-simple package is automatically discovered, through entry points, by the Llama-Index model manager of the command line tool named quackamollie. Therefore, once installed, you should automatically see this model listed by the Telegram /settings command.

You should pull an Ollama model for this Llama-Index model by simply using the ollama command line tool:

ollama pull llama3

For details on how to run the Quackamollie project, please refer to the Quackamollie project's main page.

Authors

Contributing

If you want to report a bug or ask for a new feature of quackamollie-model-llama-index-simple, please open an issue in the GitLab ticket management section of this project. Please first ensure that your issue is not redundant with already open issues.

If you want to contribute code to this project, please first open an issue and then a merge request with commit names referencing this issue. Note that only fast-forward merge requests are accepted.

For more details on the general contributing mindset of this project, please refer to CONTRIBUTING.md.

Credits

This section is still being written. Sorry for the inconvenience.

