ProtoLLM

A library for quickly and easily prototyping LLM-based applications.

License: see the repository licence file | Support: Telegram chat | Languages: eng, rus | Mirror: GitLab mirror of this repository | Funding: ITMO, SAI

Intro

ProtoLLM is an open-source framework for rapid prototyping of LLM-based applications.

ProtoLLM features

  • Rapid prototyping of LLM-based information retrieval systems using RAG:

    ◦ Implementations of architectural patterns for interacting with different databases and web service interfaces;
    ◦ Methods for optimising RAG pipelines to eliminate redundancy.

  • Development and integration of LLM applications, with external services and models connected through a plugin system:

    ◦ Integration with AutoML solutions for predictive tasks;
    ◦ Structured output generation and validation.

  • Ensemble methods and multi-agent approaches to improve the efficiency of LLMs:

    ◦ Combining arbitrary LLMs into ensembles to improve generation quality, with automatic selection of the ensemble composition;
    ◦ Support for agent models and ensemble pipelines.

  • Generation of complex synthetic data for further training and improvement of LLMs:

    ◦ Generating examples from existing models and datasets;
    ◦ Evolutionary optimisation to increase the diversity of examples;
    ◦ Integration with Label Studio.

  • Interoperability with various LLM providers:

    ◦ Support for native providers (GigaChat, YandexGPT, vsegpt, etc.);
    ◦ Interaction with open-source models deployed locally.
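The RAG pattern behind the first feature can be sketched in plain Python. This is an illustrative sketch only, not ProtoLLM's actual API: all names are hypothetical, and a toy bag-of-words overlap stands in for real embedding similarity.

```python
from collections import Counter

def tokens(text: str) -> Counter:
    # Lowercase bag-of-words; punctuation stripped from token edges.
    return Counter(t.strip(".,!?") for t in text.lower().split())

def score(query: str, doc: str) -> int:
    # Token-overlap similarity as a stand-in for embedding similarity.
    return sum((tokens(query) & tokens(doc)).values())

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity and keep the top-k as context.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, context: list[str]) -> str:
    # Inject the retrieved passages into the prompt sent to an LLM.
    return "Context:\n" + "\n".join(context) + f"\n\nQuestion: {query}\nAnswer:"

docs = [
    "RAG pipelines retrieve passages from databases before generation.",
    "Label Studio can be used to annotate synthetic training data.",
]
prompt = build_prompt("How do RAG pipelines use databases?",
                      retrieve("RAG pipelines databases", docs))
```

In a real pipeline the retriever would query a vector store and the prompt would be passed to one of the LLM providers listed above; the framework's value is in swapping those components without rewriting the surrounding logic.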

Project Structure

The latest stable release of ProtoLLM is in the main branch.

The repository includes the following directories:

  • Package protollm contains the main modules. It is the core of the ProtoLLM framework;

  • Package protollm_tools contains side tools with specific dependencies;

  • Package examples includes several use cases that show how ProtoLLM works and are a good place to start;

  • All unit and integration tests are located in the test directory;

  • The sources of the documentation are in the docs directory.

Installation

  • The simplest way to install ProtoLLM is using pip:

$ pip install protollm

A standard installation of ProtoLLM includes the main package with its dependencies and protollm-sdk from protollm_tools.

  • Installation with extras:

$ pip install protollm[api-tools]

When installing with the api-tools extras, protollm-worker and protollm-api are additionally installed.

  • Modules with tools can be installed separately:

$ pip install protollm-worker

$ pip install protollm-api

$ pip install protollm-sdk
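After installing any combination of the packages above, a quick way to check which ones are present is to probe their import names. The module names below are assumptions inferred from the distribution names (hyphens replaced by underscores); verify them against the documentation.

```python
import importlib.util

def installed(modules: list[str]) -> dict[str, bool]:
    # True for each module whose import spec can be found on the current path.
    return {m: importlib.util.find_spec(m) is not None for m in modules}

# Hypothetical import names mirroring the distribution names.
status = installed(["protollm", "protollm_api", "protollm_worker"])
```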

Contribution Guide

  • The contribution guide is available in this repository.

Acknowledgments

We are grateful to our contributors for their important work, and to the participants of numerous scientific conferences and workshops for their valuable advice and suggestions.

Supported by

The study is supported by the Research Center Strong Artificial Intelligence in Industry of ITMO University as part of the plan of the center’s program “Framework for rapid application prototyping based on large language models”.

Contacts

Papers about ProtoLLM-based solutions:

  • Kalyuzhnaya A. et al. LLM Agents for Smart City Management: Enhancing Decision Support Through Multi-Agent AI Systems // Smart Cities. 2025. Vol. 8, No. 1. P. 19.

  • Zakharov K. et al. Forecasting Population Migration in Small Settlements Using Generative Models under Conditions of Data Scarcity // Smart Cities. 2024. Vol. 7, No. 5. P. 2495-2513.

  • Kovalchuk M. A. et al. SemConvTree: Semantic Convolutional Quadtrees for Multi-Scale Event Detection in Smart City // Smart Cities. 2024. Vol. 7, No. 5. P. 2763-2780.
