
# LOLM - Lightweight Orchestrated LLM Manager


A reusable LLM configuration and management package for Python projects: multi-provider support with automatic fallback, priority routing, and unified configuration.

## ✨ Features

- 🔄 **Multi-provider support** - OpenRouter, Ollama, Groq, Together, LiteLLM
- **Automatic fallback** - seamless provider switching on failure
- 🎯 **Priority routing** - configure provider order by priority
- 🔧 **Unified config** - `.env`, `litellm_config.yaml`, `~/.lolm/`
- 🖥️ **CLI interface** - similar to `reclapp llm`

## 🚀 Installation

```bash
pip install lolm
```

Or with optional dependencies:

```bash
pip install lolm[full]      # All providers
pip install lolm[ollama]    # Ollama support
pip install lolm[litellm]   # LiteLLM support
```

## 📖 Quick Start

### CLI

```bash
# Show provider status
lolm status

# Set default provider
lolm set-provider openrouter

# Set model
lolm set-model openrouter nvidia/nemotron-3-nano-30b-a3b:free

# Manage API keys
lolm key set openrouter YOUR_API_KEY

# Test generation
lolm test
```

### Python API

```python
from lolm import get_client, LLMManager

# Simple usage
client = get_client()
response = client.generate("Explain this code")

# With specific provider
client = get_client(provider='openrouter')
response = client.generate("Hello!", system="You are helpful")

# With manager for fallback
manager = LLMManager()
manager.initialize()

response = manager.generate_with_fallback(
    "Generate code",
    providers=['openrouter', 'groq', 'ollama']
)
```
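The fallback pattern above walks the `providers` list in order and returns the first successful response. The snippet below is a conceptual sketch of that pattern only, not lolm's actual internals; `fallback_generate` and the toy backends are hypothetical stand-ins:

```python
# Conceptual sketch of priority-ordered fallback -- not lolm's internals.
# Try each backend in the configured order; return the first success.
def fallback_generate(prompt, providers, backends):
    errors = {}
    for name in providers:
        try:
            return name, backends[name](prompt)
        except Exception as exc:  # a real client would be more selective
            errors[name] = exc
    raise RuntimeError(f"all providers failed: {errors}")

def flaky(prompt):
    raise ConnectionError("rate limited")

# Toy backends: the first always fails, the second succeeds.
backends = {"openrouter": flaky, "groq": lambda p: f"echo: {p}"}
provider, text = fallback_generate("hi", ["openrouter", "groq"], backends)
print(provider, text)  # groq echo: hi
```

The key design point is that per-provider errors are collected rather than raised immediately, so the caller only fails after every provider has been tried.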

## ⚙️ Configuration

### Environment Variables (`.env`)

```bash
# API Keys
OPENROUTER_API_KEY=sk-or-v1-...
GROQ_API_KEY=gsk_...
TOGETHER_API_KEY=...

# Default provider
LLM_PROVIDER=auto

# Model overrides
OPENROUTER_MODEL=nvidia/nemotron-3-nano-30b-a3b:free
OLLAMA_MODEL=qwen2.5-coder:14b
```
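To make `LLM_PROVIDER=auto` concrete, here is a hedged sketch of how auto-selection *might* work — pick the first cloud provider with a configured key, else fall back to local Ollama. The `PROVIDER_KEYS` table and `resolve_provider` helper are hypothetical, not lolm's documented behavior:

```python
# Hedged sketch of LLM_PROVIDER=auto resolution -- not lolm's actual logic.
PROVIDER_KEYS = {
    "openrouter": "OPENROUTER_API_KEY",
    "groq": "GROQ_API_KEY",
    "together": "TOGETHER_API_KEY",
}

def resolve_provider(env):
    choice = env.get("LLM_PROVIDER", "auto")
    if choice != "auto":
        return choice  # explicit setting wins
    for provider, key_var in PROVIDER_KEYS.items():
        if env.get(key_var):
            return provider  # first provider with a configured key
    return "ollama"  # local fallback needs no API key

print(resolve_provider({"LLM_PROVIDER": "auto", "GROQ_API_KEY": "gsk_x"}))  # groq
print(resolve_provider({"LLM_PROVIDER": "ollama"}))  # ollama
```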

### `litellm_config.yaml`

```yaml
model_list:
  - model_name: code-analyzer
    litellm_params:
      model: ollama/qwen2.5-coder:7b
      api_base: http://localhost:11434
    priority: 10

router_settings:
  routing_strategy: simple-shuffle
  num_retries: 3
```
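A router can use the `priority` field to order candidate models. The sketch below shows the config above as Python data plus a hypothetical second entry, and assumes higher `priority` means tried first — which may not match LiteLLM's actual semantics:

```python
# The model_list above as Python data; "cloud-fallback" is a hypothetical
# second entry added for illustration.
config = {
    "model_list": [
        {"model_name": "code-analyzer",
         "litellm_params": {"model": "ollama/qwen2.5-coder:7b",
                            "api_base": "http://localhost:11434"},
         "priority": 10},
        {"model_name": "cloud-fallback",
         "litellm_params": {"model": "gpt-4"},
         "priority": 1},
    ],
}

# Order entries by priority, highest first (assumption, see lead-in).
ordered = sorted(config["model_list"],
                 key=lambda entry: entry.get("priority", 0), reverse=True)
print([entry["model_name"] for entry in ordered])  # ['code-analyzer', 'cloud-fallback']
```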

## 🖥️ CLI Reference

| Command | Description |
| --- | --- |
| `lolm status` | Show provider status |
| `lolm set-provider PROVIDER` | Set default provider |
| `lolm set-model PROVIDER MODEL` | Set model for provider |
| `lolm key set PROVIDER KEY` | Set API key |
| `lolm key show` | Show configured keys |
| `lolm models [PROVIDER]` | List recommended models |
| `lolm test [--provider P]` | Test LLM generation |
| `lolm config show` | Show configuration |
| `lolm priority set-provider P N` | Set provider priority |
| `lolm priority set-mode MODE` | Set priority mode |

## 🔌 Supported Providers

| Provider | Type | Free Tier | Default Model |
| --- | --- | --- | --- |
| OpenRouter | Cloud | ✅ | `nvidia/nemotron-3-nano-30b-a3b:free` |
| Ollama | Local | ✅ | `qwen2.5-coder:14b` |
| Groq | Cloud | ✅ | `llama-3.1-70b-versatile` |
| Together | Cloud | - | `Qwen/Qwen2.5-Coder-32B-Instruct` |
| LiteLLM | Universal | - | `gpt-4` |
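The per-provider defaults in the table combine with the `*_MODEL` environment overrides shown earlier. The following sketch illustrates that lookup order; the `model_for` helper is hypothetical, not part of lolm's API:

```python
import os

# Per-provider default models, from the table above.
DEFAULT_MODELS = {
    "openrouter": "nvidia/nemotron-3-nano-30b-a3b:free",
    "ollama": "qwen2.5-coder:14b",
    "groq": "llama-3.1-70b-versatile",
    "together": "Qwen/Qwen2.5-Coder-32B-Instruct",
    "litellm": "gpt-4",
}

def model_for(provider, env=os.environ):
    # Hypothetical helper: an env override like OLLAMA_MODEL wins,
    # otherwise fall back to the provider's default.
    override = env.get(f"{provider.upper()}_MODEL")
    return override or DEFAULT_MODELS[provider]

print(model_for("ollama", {"OLLAMA_MODEL": "qwen2.5-coder:7b"}))  # qwen2.5-coder:7b
print(model_for("groq", {}))  # llama-3.1-70b-versatile
```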

## 🧰 Monorepo (code2logic) workflow

If you use `lolm` inside the `code2logic` monorepo, you can manage all packages from the repository root:

```bash
make test-all
make build-subpackages
make publish-all
```

See `docs/19-monorepo-workflow.md`.

## 🧪 Development

```bash
# Install dev dependencies
make install-dev

# Run tests
make test

# Format code
make format

# Lint
make lint

# Build package
make build

# Publish to PyPI
make publish
```

## 📄 License

Apache 2.0 License - see LICENSE for details.
