
L2M2: Simple LLM Manager for Python

L2M2 ("LLM Manager" → "LLMM" → "L2M2") is a very simple LLM manager for Python.

Supported Models

L2M2 currently supports the following models:

Provider  | Model Name      | Model Version
----------|-----------------|--------------------------
openai    | gpt-4-turbo     | gpt-4-0125-preview
google    | gemini-1.5-pro  | gemini-1.5-pro-latest
google    | gemini-1.0-pro  | gemini-1.0-pro-latest
anthropic | claude-3-opus   | claude-3-opus-20240229
anthropic | claude-3-sonnet | claude-3-sonnet-20240229
anthropic | claude-3-haiku  | claude-3-haiku-20240307
cohere    | command-r       | command-r
cohere    | command-r-plus  | command-r-plus
groq      | llama2-70b      | llama2-70b-4096
groq      | mixtral-8x7b    | mixtral-8x7b-32768
groq      | gemma-7b        | gemma-7b-it

Installation

pip install l2m2

Usage

Import the LLM Client

from l2m2 import LLMClient

llms = LLMClient()

Add a Provider

To activate any of the available models, add that model's provider and pass in your API key for that provider. Use the provider name exactly as it appears in the table above.

llms.add_provider("<provider name>", "<API key>")
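
For example, to activate Anthropic's Claude models you might do the following (the ANTHROPIC_API_KEY environment variable name is just an illustrative choice, not something L2M2 requires):

import os

llms.add_provider("anthropic", os.getenv("ANTHROPIC_API_KEY"))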

Call your LLM

The call API is the same regardless of model or provider.

response = llms.call(
    system_prompt="<system prompt>",
    prompt="<prompt>",
    model="<model name>",
    temperature=<temperature>,
)

system_prompt and temperature are optional, and default to None and 0.0 respectively.
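
For instance, assuming the cohere provider has been added, a minimal call that relies on those defaults might look like this (the prompt text is illustrative):

response = llms.call(
    prompt="In one sentence, what does an LLM manager do?",
    model="command-r",
)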

List Available Models and Providers

These return every valid model that can be passed to call and every provider that can be passed to add_provider.

print(LLMClient.get_available_models())
print(LLMClient.get_available_providers())
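
As a sketch of how these might be used with the client created above, the loop below activates every available provider for which an API key happens to be set; the environment variable names in the mapping are assumptions for illustration, and os is imported as in the earlier snippet.

# Map provider names to environment variables that may hold their keys.
# These variable names are illustrative; use whatever your setup defines.
key_env_vars = {
    "openai": "OPENAI_API_KEY",
    "google": "GOOGLE_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "cohere": "COHERE_API_KEY",
    "groq": "GROQ_API_KEY",
}

for provider in LLMClient.get_available_providers():
    env_var = key_env_vars.get(provider)
    if env_var and os.getenv(env_var):
        llms.add_provider(provider, os.getenv(env_var))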

List Active Models and Providers

These return only the models and providers that have been activated via add_provider.

print(llms.get_active_models())
print(llms.get_active_providers())
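
One way to use these is as a guard before calling a model, for example (the model name and prompt here are illustrative):

if "claude-3-haiku" in llms.get_active_models():
    response = llms.call(
        prompt="Summarize the plot of Moby-Dick in one sentence.",
        model="claude-3-haiku",
    )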

Example

import os
from dotenv import load_dotenv
from l2m2 import LLMClient

load_dotenv()


llms = LLMClient()
llms.add_provider("openai", os.getenv("OAI_APIKEY"))

response = llms.call(
    system_prompt="Respond as if you were a pirate.",
    prompt="How's the weather today?",
    model="gpt-4-turbo",
    temperature=0.5,
)

print(response)

Arrr, matey! The skies be clear as the Caribbean waters today, with the sun blazin' high 'bove us. A fine day fer settin' sail and huntin' fer treasure, it be. But keep yer eye on the horizon, for the weather can turn quicker than a sloop in a squall. Yarrr!
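
Because the call signature does not change across providers, the same prompt can be sent to models from different providers with no other code changes. Below is a sketch assuming keys for both providers are available; the CO_APIKEY environment variable name is an assumption for illustration.

import os
from dotenv import load_dotenv
from l2m2 import LLMClient

load_dotenv()

llms = LLMClient()
llms.add_provider("openai", os.getenv("OAI_APIKEY"))
llms.add_provider("cohere", os.getenv("CO_APIKEY"))  # env var name is illustrative

# Same prompt, same call signature, two different providers.
for model in ["gpt-4-turbo", "command-r-plus"]:
    print(llms.call(prompt="Name one use case for a local LLM.", model=model))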
