
Giskard LLM Utils

A Python library providing utility functions and tools for working with Large Language Models (LLMs). This library is part of the Giskard ecosystem and provides various utilities for LLM operations, including model management, clustering, and more.

Purpose

This library aims to simplify working with LLMs by providing:

  • A unified interface for different LLM providers through LiteLLM
  • Support for both cloud-based and local embedding models
  • Easy configuration through environment variables or direct initialization
  • Synchronous and asynchronous operations for better performance

Installation

Standard Installation

pip install giskard-lmutils

Local Embedding Support

For local embedding capabilities, install with the local-embedding extra:

pip install "giskard-lmutils[local-embedding]"

This will install the required dependencies (torch and transformers) for running embedding models locally.

Development Installation

  1. Install Python, uv, and make
  2. Clone this repository
  3. Set up the virtual environment with make setup

Using LiteLLMModel

The LiteLLMModel class provides a unified interface for working with various LLM providers through the LiteLLM library. It supports both completion and embedding operations, with both synchronous and asynchronous methods.

Configuration

You can configure the model in two ways:

  1. Through environment variables:

# Required for OpenAI models
export OPENAI_API_KEY="your-api-key"

# Model configuration
export GSK_COMPLETION_MODEL="gpt-3.5-turbo"
export GSK_EMBEDDING_MODEL="text-embedding-ada-002"

Then initialize the model without explicit model names; they are read from the environment:

from giskard_lmutils.model import LiteLLMModel

# Model names are taken from the environment variables above
model = LiteLLMModel(
    completion_params={"temperature": 0.7},
    embedding_params={"is_local": False}  # Optional, defaults to False
)

Note: The environment variable prefix can be customized by passing an env_prefix parameter to the LiteLLMModel initialization. This allows you to use different models within the same application by setting different environment variables (e.g., CUSTOM_PREFIX_COMPLETION_MODEL).
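As a minimal sketch of this, assuming the prefix is passed without a trailing underscore (the exact convention may differ; check the library source), a second model could be configured like so:

```shell
# Hypothetical custom prefix "CUSTOM_PREFIX"; pair these variables with
# LiteLLMModel(env_prefix="CUSTOM_PREFIX") in Python
export CUSTOM_PREFIX_COMPLETION_MODEL="gpt-3.5-turbo"
export CUSTOM_PREFIX_EMBEDDING_MODEL="text-embedding-ada-002"
```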

  2. Through explicitly specified model names:
model = LiteLLMModel(
    completion_model="gpt-3.5-turbo",
    embedding_model="text-embedding-ada-002",
    completion_params={"temperature": 0.7},
    embedding_params={"is_local": False}  # Optional, defaults to False
)

Note: When using OpenAI models, you must set the OPENAI_API_KEY environment variable. For other providers, refer to the LiteLLM documentation for their specific API key requirements.

Usage Examples

Text Completion

# Synchronous completion
response = model.complete([
    {"role": "user", "content": "What is the capital of France?"}
])

# Asynchronous completion
response = await model.acomplete([
    {"role": "user", "content": "What is the capital of France?"}
])
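
Note that the asynchronous methods must be awaited inside an event loop. In a plain script, the usual pattern is to wrap the call with asyncio.run; a minimal sketch, using a stand-in coroutine in place of the real acomplete call:

```python
import asyncio

async def acomplete_stub(messages):
    # Stand-in for model.acomplete; replace with the real call.
    await asyncio.sleep(0)
    return {"role": "assistant", "content": "Paris"}

async def main():
    # Await the coroutine inside an async function
    return await acomplete_stub([
        {"role": "user", "content": "What is the capital of France?"}
    ])

# asyncio.run starts the event loop and runs main() to completion
response = asyncio.run(main())
print(response["content"])
```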

Text Embedding

# Synchronous embedding
embeddings = model.embed(["Hello, world!", "Another text"])

# Asynchronous embedding
embeddings = await model.aembed(["Hello, world!", "Another text"])

# Local embedding
model = LiteLLMModel(
    embedding_model="sentence-transformers/all-MiniLM-L6-v2",
    embedding_params={"is_local": True}
)
embeddings = model.embed(["Hello, world!"])
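
Embeddings are numeric vectors, so downstream tasks such as clustering or semantic search typically compare them with cosine similarity. A minimal, self-contained sketch using toy vectors standing in for model.embed output (which is assumed here to be a list of float vectors):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the vector norms
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embeddings of two similar and one unrelated text
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, 1.0]
v3 = [0.0, 1.0, 0.0]

print(cosine_similarity(v1, v2))  # identical vectors -> 1.0
print(cosine_similarity(v1, v3))  # orthogonal vectors -> 0.0
```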

Requirements

  • Python >= 3.9, < 3.14
  • Core dependencies:
    • numpy >= 2.2.2
    • litellm >= 1.59.3
  • Optional dependencies (for local embedding):
    • torch >= 2.6.0
    • transformers >= 4.51.3

License

This project is licensed under the Apache Software License 2.0 - see the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Authors
