Simple Llama-Index model based on Ollama using Llama-Index model manager for Quackamollie Telegram chat bot

Project description

Name:

Quackamollie Model Llama-Index Simple

Package name:

quackamollie-model-llama-index-simple

Description:

Simple Llama-Index model based on Ollama using Llama-Index model manager for Quackamollie Telegram chat bot

Version:

0.1

Main page:

https://gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple

PyPI package:

https://pypi.org/project/quackamollie-model-llama-index-simple/

Docker Image:

registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple:0.1

Documentation:

https://simple-llama-index-model-forge-of-absurd-ducks-q-8cc6f6ddf9ba3f.gitlab.io




Quackamollie is a Telegram chat bot developed in Python using the aiogram library to serve LLM models running locally through Ollama.

This package is a simple Llama-Index model compatible with the Llama-Index model manager of the Quackamollie project. It contains:

  • a SimpleLlamaIndexQuackamollieModel class implementing the abstract functions of MetaLlamaIndexQuackamollieModel
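The two class names above come from the package itself; everything else in the following sketch is a stand-in, since the real MetaLlamaIndexQuackamollieModel interface is defined in the Quackamollie Llama-Index model manager and may differ. The sketch only illustrates the general pattern of implementing the abstract functions of a meta model class:

```python
from abc import ABC, abstractmethod


# Stand-in for MetaLlamaIndexQuackamollieModel; the real base class and its
# method signatures live in the Llama-Index model manager package and may differ.
class MetaLlamaIndexModelSketch(ABC):
    model_name: str

    @abstractmethod
    def answer(self, prompt: str) -> str:
        """Produce a reply to a chat message."""


# Stand-in for SimpleLlamaIndexQuackamollieModel.
class SimpleModelSketch(MetaLlamaIndexModelSketch):
    model_name = "llama3"

    def answer(self, prompt: str) -> str:
        # A real implementation would forward the prompt to the Ollama model
        # through Llama-Index; here we simply echo it for illustration.
        return f"[{self.model_name}] {prompt}"


print(SimpleModelSketch().answer("Quack?"))
```

Because the abstract method is implemented, SimpleModelSketch can be instantiated, whereas instantiating the meta class directly would raise a TypeError.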

Learn more about Quackamollie on the project main page.

Requirements

Virtual environment

  • Set up a virtual environment with Python 3.10

make venv
# or
python3 -m venv venv
  • Activate the environment

source venv/bin/activate
  • If you want to deactivate the environment

deactivate

Tests

Tests requirements

  • Install test requirements

make devtools
# or
pip install tox

Run pytest

  • Run the tests

tox

Run lint

  • Run the linter

tox -e lint

Documentation

  • To auto-generate the documentation configuration

tox -e gendocs
  • To generate the documentation in HTML

tox -e docs
  • An automatically generated version of this project documentation can be found here

    • N.B.: This automatically generated documentation still lacks many improvements. Sorry for the inconvenience.

Install

  • Install the application from sources

make install
# or
pip install .
  • Or install it from distribution

pip install dist/quackamollie_model_llama_index_simple-0.1.tar.gz
  • Or install it from wheel

pip install dist/quackamollie_model_llama_index_simple-0.1-py3-none-any.whl
  • Or install it from PyPi repository

pip install quackamollie-model-llama-index-simple  # latest
# or
pip install "quackamollie-model-llama-index-simple==0.1"

Docker

  • To build the application's Docker image

docker build --network=host -t quackamollie_model_llama_index_simple:0.1 .
  • The official Docker image of this project is available at: registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple

  • You can pull the image of the current release:

docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple:latest  # or dev
# or
docker pull registry.gitlab.com/forge_of_absurd_ducks/quackamollie/lib/models/llama_index/quackamollie_model_llama_index_simple:0.1

Running the model

The quackamollie-model-llama-index-simple package is automatically discovered, through entry points, by the Llama-Index model manager of the command-line tool named quackamollie. Therefore, once installed, you should automatically see this model in the Telegram /settings command.

You should pull an Ollama model for this Llama-Index model using the ollama command-line tool:

ollama pull llama3

For details on how to run the Quackamollie project, please refer to the Quackamollie project's main page.

Authors

Contributing

If you want to report a bug or ask for a new feature of quackamollie-model-llama-index-simple, please open an issue in the GitLab issue tracker of this project. Please first ensure that your issue does not duplicate an already open issue.

If you want to contribute code to this project, please first open an issue and then a merge request with commit messages referencing this issue. Note that only fast-forward merge requests are accepted.

For more details on the general contributing mindset of this project, please refer to CONTRIBUTING.md.

Credits

This section is still being written; sorry for the inconvenience.

Project details

Source distribution: quackamollie_model_llama_index_simple-0.1.tar.gz (19.4 kB)
Built distribution: quackamollie_model_llama_index_simple-0.1-py3-none-any.whl

File hashes

Hashes for quackamollie_model_llama_index_simple-0.1.tar.gz:

Algorithm    Hash digest
SHA256       0bad24f5ab877f5cf784e1181dd89dee5d36e0ca80e1db9dd715ddb02c274fba
MD5          d51cdc65eee4a746279088aa20c2fac9
BLAKE2b-256  40b1b805df4fc169c71deeaae999e8e4dcc037ac6a26fe75146be6375a944a87

Hashes for quackamollie_model_llama_index_simple-0.1-py3-none-any.whl:

Algorithm    Hash digest
SHA256       4600980d8477280e724b73cc071a82296eeaebe578db8dcaffc018c8c71b11f0
MD5          0b4340d384a5f06c8524bfc57e2553be
BLAKE2b-256  1daa12a63326aa0df46d610e190d7f6900cafd70496badf7978c28815d6501b4
