
A tiny library for managing calls to LLMs from different services (Paris-Saclay Aristote included).

Project description

Unified Model Caller

A small, lightweight library that provides a single unified interface for calling LLMs from different providers. Instead of learning each provider's SDK separately, you instantiate one LLMCaller and swap the service name.

Supported services

Service name         Provider
openai               OpenAI (GPT models)
anthropic            Anthropic (Claude models)
google               Google (Gemini models)
xai                  xAI (Grok models)
ilaas                Ilaas
aristoteonmydocker   Aristote on MyDocker

Installation

Via pip:

pip install git+https://github.com/DobbiKov/unified-model-caller.git

Via uv:

uv add git+https://github.com/DobbiKov/unified-model-caller.git

Usage

from unified_model_caller import LLMCaller

caller = LLMCaller("google", "gemini-2.0-flash", api_key="<your-api-key>")
response = caller.call("What is a matrix?")
print(response)

The constructor signature is:

LLMCaller(service: str, model: str, api_key: str = "")
  • service — case-insensitive service name (see table above)
  • model — model identifier string passed directly to the provider
  • api_key — API key; can be omitted for services that don't require one
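Since the service name is case-insensitive, the normalization can be pictured roughly as follows. This is an illustrative sketch only, not the library's actual code; SERVICES simply mirrors the table above and resolve_service is a hypothetical helper:

```python
# Illustrative sketch of a case-insensitive service lookup (not the library's code).
SERVICES = ["openai", "anthropic", "google", "xai", "ilaas", "aristoteonmydocker"]

def resolve_service(name: str) -> str:
    """Normalize a user-supplied service name to its canonical lowercase form."""
    key = name.strip().lower()
    if key not in SERVICES:
        raise ValueError(f"unknown service: {name!r}")
    return key

print(resolve_service("Google"))   # -> google
print(resolve_service("OpenAI"))   # -> openai
```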

Rate limiting

Call wait_cooldown() between requests to respect each service's built-in cooldown:

caller.wait_cooldown()
response = caller.call("Next prompt")
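Conceptually, a cooldown like this is tracked by remembering the time of the last call and sleeping for the remainder. The standalone sketch below illustrates that idea; it is not the library's internals, and CooldownTimer is a hypothetical name:

```python
import time

class CooldownTimer:
    """Illustrative sketch of a per-service cooldown, not the library's internals."""

    def __init__(self, cooldown_ms: int):
        self.cooldown = cooldown_ms / 1000.0  # service_cooldown() reports milliseconds
        self.last_call = None

    def wait_cooldown(self) -> None:
        """Sleep until the cooldown has elapsed since the previous call."""
        if self.last_call is not None:
            remaining = self.cooldown - (time.monotonic() - self.last_call)
            if remaining > 0:
                time.sleep(remaining)
        self.last_call = time.monotonic()

timer = CooldownTimer(1000)  # 1 second between calls
timer.wait_cooldown()        # first call: returns immediately
```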

Listing available services

LLMCaller.get_services()
# ['openai', 'anthropic', 'google', 'xai', 'ilaas', 'aristoteonmydocker']

Adding an external service

You can register a new service at runtime from any Python file — no changes to the library are needed.

1. Create a service file

The file must define a class that inherits from BaseService and implements four methods:

# my_service.py
from unified_model_caller import BaseService

class MyService(BaseService):
    def get_name(self) -> str:
        """Unique lowercase name used to identify this service."""
        return "myservice"

    def requires_token(self) -> bool:
        """Return True if the service needs an API key."""
        return True

    def service_cooldown(self) -> int:
        """Minimum delay between calls, in milliseconds."""
        return 1000

    def call(self, model: str, prompt: str) -> str:
        """Send the prompt to the model and return the response text."""
        import requests

        response = requests.post(
            "https://api.myservice.example/v1/completions",
            json={"model": model, "prompt": prompt},
            headers={"Authorization": f"Bearer {self.api_key}"},
            timeout=30,  # avoid hanging indefinitely on a stalled connection
        )
        response.raise_for_status()  # surface HTTP errors instead of failing on .json()
        return response.json()["text"]

The api_key passed to LLMCaller(...) is available as self.api_key inside your class.

2. Register and use it

from unified_model_caller import LLMCaller

LLMCaller.add_service("/path/to/my_service.py")

caller = LLMCaller("myservice", "my-model-name", api_key="<your-api-key>")
response = caller.call("Hello!")
print(response)

add_service loads the file, finds the BaseService subclass inside it, and registers it globally under the name returned by get_name(). The service is then available to all subsequent LLMCaller instances in the same process.
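As a rough picture of what such dynamic loading involves, the sketch below loads a file with importlib and scans it for a class implementing the four-method contract. This is an illustration of the general technique only; the library's real add_service may differ (for example, by checking BaseService inheritance directly):

```python
import importlib.util
import inspect

def load_service_class(path: str):
    """Load a Python file and return the first class defined in it that
    provides the four-method service contract (illustrative sketch)."""
    spec = importlib.util.spec_from_file_location("external_service", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # run the file so its classes exist

    required = {"get_name", "requires_token", "service_cooldown", "call"}
    for _, obj in inspect.getmembers(module, inspect.isclass):
        # Only consider classes defined in the loaded file itself.
        if obj.__module__ == module.__name__ and required.issubset(dir(obj)):
            return obj
    raise TypeError(f"no service class found in {path}")
```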

BaseService contract

Method                     Return type   Description
get_name(self)             str           Unique service identifier (lowercase); used as the service argument to LLMCaller.
requires_token(self)       bool          Whether the service needs an API key.
service_cooldown(self)     int           Cooldown between calls, in milliseconds.
call(self, model, prompt)  str           Perform the API call and return the response text.
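As a compact end-to-end illustration of this contract, here is a minimal offline service that provides all four methods. It is written as a standalone class so the snippet runs without the library installed; a real service would inherit from BaseService, and EchoService is a hypothetical name:

```python
class EchoService:
    """Minimal offline sketch of the service contract (standalone for
    illustration; a real service would subclass BaseService)."""

    api_key = ""  # LLMCaller would populate this from its api_key argument

    def get_name(self) -> str:
        return "echo"

    def requires_token(self) -> bool:
        return False  # no API key needed

    def service_cooldown(self) -> int:
        return 0  # milliseconds; no delay between calls

    def call(self, model: str, prompt: str) -> str:
        return f"[{model}] {prompt}"

svc = EchoService()
print(svc.call("toy-model", "Hello!"))  # -> [toy-model] Hello!
```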

Project details


Download files

Download the file for your platform.

Source Distribution

unified_model_caller-0.2.0.tar.gz (6.5 kB)

Uploaded Source

Built Distribution


unified_model_caller-0.2.0-py3-none-any.whl (8.8 kB)

Uploaded Python 3

File details

Details for the file unified_model_caller-0.2.0.tar.gz.

File metadata

  • Download URL: unified_model_caller-0.2.0.tar.gz
  • Upload date:
  • Size: 6.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for unified_model_caller-0.2.0.tar.gz
Algorithm Hash digest
SHA256 24d0fa4787e69e3e6aedca6bb5d83244aa67adb1bfcb6cff50b0ae2cbf3ae8f7
MD5 af76b7aa72c46e656f4e55232795c587
BLAKE2b-256 d3b280bcd212f036b0c4e7468a34328b181e716387993ccf947565a56a8f3743


Provenance

The following attestation bundles were made for unified_model_caller-0.2.0.tar.gz:

Publisher: release.yml on DobbiKov/unified-model-caller

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file unified_model_caller-0.2.0-py3-none-any.whl.

File metadata

File hashes

Hashes for unified_model_caller-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 e90f400d6de44865efd9296b5255ba10293795008f71bd6b94916e1e5e345d3e
MD5 7295291fec8a530f6723c95b7b8f05b5
BLAKE2b-256 7521bde249d34feab90a667d3cdf3287eca8b652bda6dfc18af563574505af3f


Provenance

The following attestation bundles were made for unified_model_caller-0.2.0-py3-none-any.whl:

Publisher: release.yml on DobbiKov/unified-model-caller

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
