Interface to handle multiple LLMs and AI tools.

Project description

llmax

Python package to manage most external and internal LLM APIs fluently.

Installation

To install, run the following command:

python3 -m pip install delos-llmax

How to use

You first have to define a mapping from Model to Deployment, where you specify the endpoint, API key, and deployment_name for each model. Then create the client:

import os

from llmax.clients import MultiAIClient
from llmax.models import Deployment, Model

deployments: dict[Model, Deployment] = {
    "gpt-4o": Deployment(
        model="gpt-4o",
        provider="azure",
        deployment_name="gpt-4o-2024-05-13",
        api_key=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_KEY", ""),
        endpoint=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_ENDPOINT", ""),
    ),
    "whisper-1": Deployment(
        model="whisper-1",
        provider="azure",
        deployment_name="whisper-1",
        api_key=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_KEY", ""),
        endpoint=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_ENDPOINT", ""),
        api_version="2024-02-01",
    ),
}

client = MultiAIClient(
    deployments=deployments,
)

Then define your input, which can be text, an image, or audio, following the OpenAI message format:

messages = [
    {"role": "user", "content": "Tell me a joke."},
]

And finally get the response:

response = client.invoke_to_str(messages, "gpt-4o")
print(response)

Specificities

When creating the client, you can also specify two functions, increment_usage and get_usage. The first is a Callable[[float, Model], bool], the second a Callable[[], float]. increment_usage is called after each LLM call, receiving the price of the call as a float and the Model that was used, so it can be used to update your database. get_usage returns whether a condition is met; for instance, it can be a function that queries your database and returns whether the user is still active.
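
A minimal sketch of what these two callables could look like, assuming they are passed to MultiAIClient under the keyword names increment_usage and get_usage, and with a simple in-memory dict standing in for a real database:

from llmax.clients import MultiAIClient
from llmax.models import Model

# In-memory stand-in for a real database (assumption for illustration).
usage_db = {"total_cost": 0.0}

def increment_usage(price: float, model: Model) -> bool:
    """Called after each LLM call with the price and the model used."""
    usage_db["total_cost"] += price
    return True  # e.g. whether the update succeeded / the user may continue

def get_usage() -> float:
    """Return the current usage, e.g. read back from your database."""
    return usage_db["total_cost"]

client = MultiAIClient(
    deployments=deployments,  # deployments as defined above
    increment_usage=increment_usage,
    get_usage=get_usage,
)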

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

delos_llmax-0.11.1.tar.gz (12.9 kB)

Uploaded Source

Built Distribution

delos_llmax-0.11.1-py3-none-any.whl (18.1 kB)

Uploaded Python 3

File details

Details for the file delos_llmax-0.11.1.tar.gz.

File metadata

  • Download URL: delos_llmax-0.11.1.tar.gz
  • Upload date:
  • Size: 12.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for delos_llmax-0.11.1.tar.gz

  • SHA256: 619d0e12560c238b95844f854b12cc3abea4766854e7096fc03b312c8943238d
  • MD5: b019f8aded9e1362afbb840abe43b0d5
  • BLAKE2b-256: 7b503335099ee96dba9599a5803cbffaf8784fd8b0f01b4cafe75b58c062fdd8

See more details on using hashes here.
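
For example, one way to check the source distribution against the SHA256 digest listed above, assuming the file has already been downloaded to the current directory:

import hashlib

# Compare the downloaded sdist against the SHA256 digest listed above.
expected = "619d0e12560c238b95844f854b12cc3abea4766854e7096fc03b312c8943238d"
with open("delos_llmax-0.11.1.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
print("OK" if actual == expected else "hash mismatch")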

File details

Details for the file delos_llmax-0.11.1-py3-none-any.whl.

File metadata

  • Download URL: delos_llmax-0.11.1-py3-none-any.whl
  • Upload date:
  • Size: 18.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for delos_llmax-0.11.1-py3-none-any.whl

  • SHA256: f661d9db5a127958e5e96ec34846b021a2384f9452451a6fd7897c9ed2fe0ade
  • MD5: 3c696b1a09a82ab3fd7486916b477443
  • BLAKE2b-256: dc3ba518140e36e07c4f7358f0226690b8d9d83cfbdb8a130abec850a30e46f2

See more details on using hashes here.
