
Interface to handle multiple LLMs and AI tools.

Project description

llmax

Python package to manage most external and internal LLM APIs fluently.

Installation

To install, run the following command:

python3 -m pip install delos-llmax

How to use

You first have to define a mapping from Model to Deployment, where you specify the endpoint, API key, and deployment_name for each model. Then create the client:

import os

from llmax.clients import MultiAIClient
from llmax.models import Deployment, Model

# Map each model name to its deployment (provider, endpoint, API key, deployment name).
deployments: dict[Model, Deployment] = {
    "gpt-4o": Deployment(
        model="gpt-4o",
        provider="azure",
        deployment_name="gpt-4o-2024-05-13",
        api_key=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_KEY", ""),
        endpoint=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_ENDPOINT", ""),
    ),
    "whisper-1": Deployment(
        model="whisper-1",
        provider="azure",
        deployment_name="whisper-1",
        api_key=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_KEY", ""),
        endpoint=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_ENDPOINT", ""),
        api_version="2024-02-01",
    ),
}

client = MultiAIClient(
    deployments=deployments,
)

Then define your input, which can be text, image, or audio, following the OpenAI message format:

messages = [
    {"role": "user", "content": "Tell me a joke."},
]
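
For instance, an image input would follow the OpenAI multimodal message format, as in the sketch below (the URL is a placeholder, and the deployed model must support vision inputs):

messages = [
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {
                "type": "image_url",
                "image_url": {"url": "https://example.com/photo.jpg"},
            },
        ],
    },
]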

And finally get the response:

response = client.invoke_to_str(messages, "gpt-4o")
print(response)

Specificities

When creating the client, you can also specify two functions, increment_usage and get_usage. The first is a Callable[[float, Model], bool], the second a Callable[[], float]. increment_usage is called after each LLM call, receiving the price as the float and the model used as the Model; it can therefore be used to update your database. get_usage returns whether a condition is met: for instance, it can query your database and return whether the user is still active.
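
A minimal sketch of wiring these hooks into the client, reusing the deployments mapping defined above; the callback bodies are placeholders to adapt to your own database:

from llmax.clients import MultiAIClient
from llmax.models import Model

def increment_usage(price: float, model: Model) -> bool:
    # Called after each LLM call with the price and the model used;
    # e.g. record the spend in your database. Placeholder implementation.
    print(f"Spent {price:.6f} using {model}")
    return True

def get_usage() -> float:
    # Placeholder: e.g. fetch the current usage from your database.
    return 0.0

client = MultiAIClient(
    deployments=deployments,  # as defined above
    increment_usage=increment_usage,
    get_usage=get_usage,
)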



Download files

Download the file for your platform.

Source Distribution

delos_llmax-0.11.3.tar.gz (12.9 kB)

Uploaded Source

Built Distribution

delos_llmax-0.11.3-py3-none-any.whl (18.1 kB)

Uploaded Python 3

File details

Details for the file delos_llmax-0.11.3.tar.gz.

File metadata

  • Download URL: delos_llmax-0.11.3.tar.gz
  • Upload date:
  • Size: 12.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for delos_llmax-0.11.3.tar.gz
  • SHA256: 6a28ceb479d003bf532240ad47542f70885283c6d00b219f16de6e35af149c7e
  • MD5: efbbd913165e7211b6cfab1812c6e481
  • BLAKE2b-256: 98c873429b4aeeee24c6c2f57b0ce122c0e50f3de3d9bfe91854758aae7748f8


File details

Details for the file delos_llmax-0.11.3-py3-none-any.whl.

File metadata

  • Download URL: delos_llmax-0.11.3-py3-none-any.whl
  • Upload date:
  • Size: 18.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for delos_llmax-0.11.3-py3-none-any.whl
  • SHA256: 975502a5a5a1585296e6fd0076c8040fcd9f0b9c917ca3ed90c356e5252b60b2
  • MD5: 86c5523b2ae5ad6de174adb299e0fccc
  • BLAKE2b-256: ac5147c4375cdfcae0fe7eaef8a0a3e71e556b68185caebd229953d3b1750824

