
Interface to handle multiple LLMs and AI tools.

Project description

llmax

A Python package for managing external and internal LLM APIs through a single, fluent interface.

Installation

To install, run the following command:

python3 -m pip install delos-llmax

How to use

First, define your deployments as a mapping from Model to Deployment, specifying the endpoint, API key, and deployment_name for each. Then create the client:

import os

from llmax.clients import MultiAIClient
from llmax.models import Deployment, Model

deployments: dict[Model, Deployment] = {
    "gpt-4o": Deployment(
        model="gpt-4o",
        provider="azure",
        deployment_name="gpt-4o-2024-05-13",
        api_key=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_KEY", ""),
        endpoint=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_ENDPOINT", ""),
    ),
    "whisper-1": Deployment(
        model="whisper-1",
        provider="azure",
        deployment_name="whisper-1",
        api_key=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_KEY", ""),
        endpoint=os.getenv("LLMAX_AZURE_OPENAI_SWEDENCENTRAL_ENDPOINT", ""),
        api_version="2024-02-01",
    ),
}

client = MultiAIClient(
    deployments=deployments,
)

Next, define your input. Messages can contain text, images, or audio, following the OpenAI message format (a text-only example is shown first, with a multimodal sketch after it):

messages = [
    {"role": "user", "content": "Tell me a joke."},
]
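
For an image input, the same messages structure can follow the OpenAI multimodal chat format. This is a sketch based on the OpenAI documentation; the URL is a placeholder, and support depends on the model and provider you target:

messages = [
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {
                "type": "image_url",
                "image_url": {"url": "https://example.com/photo.jpg"},
            },
        ],
    },
]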

Finally, request the response, passing the messages and the model key of one of your deployments:

response = client.invoke_to_str(messages, "gpt-4o")
print(response)

Specificities

When creating the client, you can also provide two functions, increment_usage and get_usage. The first has the signature Callable[[float, Model], bool] and the second Callable[[], float]. increment_usage is called after each LLM call with the price of the call and the Model that was used, so it can be used to update your usage database. get_usage reports whether a condition is met; for instance, it can query your database and return whether the user is still allowed to make requests.
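
As a minimal sketch, assuming both callables are passed as keyword arguments of the same names when constructing MultiAIClient (the helper bodies, budget value, and in-memory accounting below are illustrative, not part of the library):

from llmax.clients import MultiAIClient
from llmax.models import Model

TOTAL_SPENT = 0.0  # illustrative in-memory stand-in for your database
BUDGET = 10.0      # example spending limit, in the currency of the reported price


def increment_usage(price: float, model: Model) -> bool:
    """Called after each LLM call with its price and the model used."""
    global TOTAL_SPENT
    TOTAL_SPENT += price  # e.g. write this to your database instead
    return TOTAL_SPENT < BUDGET


def get_usage() -> float:
    """Callable[[], float]: report the current usage total."""
    return TOTAL_SPENT


client = MultiAIClient(
    deployments=deployments,  # as defined above
    increment_usage=increment_usage,
    get_usage=get_usage,
)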

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

delos_llmax-0.11.2.tar.gz (12.9 kB)

Uploaded Source

Built Distribution

delos_llmax-0.11.2-py3-none-any.whl (18.1 kB)

Uploaded Python 3

File details

Details for the file delos_llmax-0.11.2.tar.gz.

File metadata

  • Download URL: delos_llmax-0.11.2.tar.gz
  • Upload date:
  • Size: 12.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for delos_llmax-0.11.2.tar.gz:

  • SHA256: 1610bc454e39965818b4f28a1635d4a59093b8642a66d9f740a2b6218ba457e6
  • MD5: fcad215324abc9031157b59297e03e3d
  • BLAKE2b-256: 31f5d805f1d2e42236042a3a0bfa8247c92e2a09c3ff20276c8f711c560f8757

See more details on using hashes here.
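
If you want to check a downloaded file against the published hashes yourself, a small Python snippet is enough (the file path assumes the sdist sits in the current directory):

import hashlib

# SHA256 published above for delos_llmax-0.11.2.tar.gz.
EXPECTED_SHA256 = "1610bc454e39965818b4f28a1635d4a59093b8642a66d9f740a2b6218ba457e6"

with open("delos_llmax-0.11.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "MISMATCH")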

File details

Details for the file delos_llmax-0.11.2-py3-none-any.whl.

File metadata

  • Download URL: delos_llmax-0.11.2-py3-none-any.whl
  • Upload date:
  • Size: 18.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for delos_llmax-0.11.2-py3-none-any.whl:

  • SHA256: 7755c5c8ed61c81fe3376edc4324ae4e288c9b231d2a09779ed3cb484589da24
  • MD5: f604be48b845d06f9a11ede478f5f6e5
  • BLAKE2b-256: 09be1011412e15b1e3d6647cebee4451717ecc9429c4772612d1563f891459bc

See more details on using hashes here.
