
Minimalist Library for programming with LLMs


MiniLLMLib


PyPI version Docs License: MIT Python


Installation

pip install minillmlib
# For HuggingFace/local models (beta, not well tested):
pip install minillmlib[huggingface]

A Python library for interacting with various LLM providers (OpenAI, Anthropic, Mistral, HuggingFace, or any custom URL endpoint).

Author: Quentin Feuillade--Montixi

Installation from Source

git clone https://github.com/qfeuilla/MiniLLMLib.git
cd MiniLLMLib
pip install -e .  # Install in editable mode

Usage

import os

import minillmlib as mll

# Create a GeneratorInfo for your model/provider
gi = mll.GeneratorInfo(
    model="gpt-4",
    _format="openai",
    api_key=os.getenv("OPENAI_API_KEY")  # Recommended: use an env var for secrets
)

# Create a chat node (conversation root)
chat = mll.ChatNode(content="Hello!", role="user")

# Synchronous completion
response = chat.complete_one(gi)
print(response.content)

# Or asynchronous version
# response = await chat.complete_one_async(gi)
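The commented async line above needs a running event loop. A minimal, self-contained sketch of the driving pattern is below; note the `complete_one_async` here is a stub returning a canned reply (not MiniLLMLib's actual implementation), so the concurrency pattern can run on its own:

```python
import asyncio

# Stub standing in for ChatNode.complete_one_async -- the real method calls
# the provider's API; this one returns a canned reply so the sketch runs offline.
async def complete_one_async(prompt: str) -> str:
    await asyncio.sleep(0)  # yield control, as a real network call would
    return f"echo: {prompt}"

async def main() -> list[str]:
    # Fire several completions concurrently -- the main reason to prefer
    # the async API over the synchronous one.
    prompts = ["Hello!", "How are you?", "Goodbye!"]
    return await asyncio.gather(*(complete_one_async(p) for p in prompts))

replies = asyncio.run(main())
print(replies)
```

With a real `GeneratorInfo`, the same `asyncio.gather` pattern lets many branches of a conversation complete in parallel.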

Features

  • Unified interface for major LLM providers:
    • OpenAI, Anthropic, Mistral, HuggingFace (local), custom URL (e.g. OpenRouter)
  • Thread (linear) and loom (tree/branching) conversation modes
  • Synchronous & asynchronous API
  • Audio completions (OpenAI audio models, beta)
  • Flexible parameter/config management via GeneratorInfo and GeneratorCompletionParameters
  • Save/load conversation trees
  • Extensible: add new models/providers easily
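The thread/loom distinction above can be pictured with a toy tree: each node keeps a parent pointer, so two assistant replies can branch from the same root, while any leaf still reconstructs one linear thread. This is an illustrative data structure only, not MiniLLMLib's actual ChatNode implementation:

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for a conversation node, for illustration only.
@dataclass
class Node:
    role: str
    content: str
    parent: "Node | None" = None
    children: list["Node"] = field(default_factory=list)

    def reply(self, role: str, content: str) -> "Node":
        # Branching: a node may gain any number of children (loom mode).
        child = Node(role, content, parent=self)
        self.children.append(child)
        return child

    def thread(self) -> list[str]:
        # Walk back to the root: the linear history a model would see.
        node, out = self, []
        while node is not None:
            out.append(f"{node.role}: {node.content}")
            node = node.parent
        return list(reversed(out))

root = Node("user", "Hello!")
a = root.reply("assistant", "Hi there!")             # branch A
b = root.reply("assistant", "Hey! How can I help?")  # branch B, same root
followup = a.reply("user", "Tell me a joke.")

# Each leaf sees only its own linear thread, even though the tree branches.
print(followup.thread())
```

Saving/loading a conversation tree then amounts to serializing this structure; thread mode is simply the special case where every node has at most one child.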

Documentation

  • See the Usage Guide for advanced usage, parameter tables, and branching/loom semantics.
  • See the Provider Matrix for supported models and configuration tips.
  • See Troubleshooting for common issues and debugging.

Development & Contribution

  • Run tests with:
    pytest tests/
    
  • See Contributing for contribution guidelines.

For more, see the full documentation at minillmlib.readthedocs.io or open an issue on GitHub if you need help.


Release Tagging Reminder

(for maintainer use)

To push a new release tag:

git add <files you changed>
git commit -m "<your message>"
git tag v<NEW_VERSION> -m "Release v<NEW_VERSION>: <short description>"
git push origin main --tags

Project details

Download files

Source distribution: minillmlib-0.3.0.tar.gz (92.0 kB)
SHA256: 5b0ba340d96ebd4eaa940705b2ef55efdfc7fed9d6a6e06a5c21e756e7553a2e

Built distribution: minillmlib-0.3.0-py3-none-any.whl (28.9 kB, Python 3)
SHA256: aa7a095fad994b14a35d8eb2dd53b161c285232f46865be8a9ec0f6999862a1b

Both files were uploaded via twine/6.1.0 (CPython/3.12.9), without Trusted Publishing.
