Yet Another LLM Client

Project description

An opinionated Python wrapper for LLM calls, with support for multiple LLM providers:

  • OpenAI
  • Anthropic
  • more to come...

Uses Pydantic models to serialize LLM responses: every response is parsed into a Pydantic model you define.

Full async support.
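To illustrate what "serialized into a model" means in practice, here is a minimal stand-alone sketch. It is not yalc's implementation: the `JudgmentResult` fields are invented, and a plain dataclass stands in for the Pydantic model so the snippet runs without dependencies.

```python
import json
from dataclasses import dataclass

# Hypothetical response model; in yalc this would be a Pydantic model.
@dataclass
class JudgmentResult:
    verdict: str
    confidence: float

def parse_structured(raw: str) -> JudgmentResult:
    # The raw LLM output is JSON whose keys must match the model's fields;
    # anything that doesn't fit raises instead of passing through untyped.
    data = json.loads(raw)
    return JudgmentResult(**data)

result = parse_structured('{"verdict": "pass", "confidence": 0.9}')
print(result.verdict)  # pass
```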

Checking models

To verify which models in LLMModel are reachable with your current API keys:

cp .env.example .env  # fill in your API keys
uv run python scripts/check_models.py

Each model is called concurrently and results print as they complete.
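The concurrent-check behavior can be sketched with `asyncio.gather`. This is an illustrative stand-in, not the script's source: `probe` is a hypothetical placeholder for the real per-provider API call made with your keys.

```python
import asyncio

# Hypothetical stand-in for a per-model reachability probe; the real
# script issues an actual request to each provider.
async def probe(model: str) -> tuple[str, bool]:
    await asyncio.sleep(0)  # placeholder for the network round-trip
    return model, True

async def check_all(models: list[str]) -> dict[str, bool]:
    # All probes run concurrently rather than one after another.
    results = await asyncio.gather(*(probe(m) for m in models))
    return dict(results)

print(asyncio.run(check_all(["gpt-4o-mini", "claude-3-5-haiku"])))
```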

Usage

Every call to the LLM returns metadata: token usage, costs, the model used, and the context messages. YALC supports two modes of operation for handling this metadata.

Metadata return mode

Metadata is returned directly alongside the response as a tuple.

client = create_client(LLMModel.gpt_4o_mini)

result, metadata = await client.structured_response(
    JudgmentResult, messages
)

Advantages:

  • Simple, no setup required
  • Direct access to metadata at the call site

Disadvantages:

  • Must handle metadata manually on every call
  • Easy to forget or handle inconsistently across call sites
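The manual bookkeeping this mode implies can be sketched as follows. The field names (`input_tokens`, `input_tokens_cost`, etc.) are assumed from the strategy example later in this README, and `CallMetadata` is a hypothetical stand-in for whatever yalc actually returns.

```python
from dataclasses import dataclass

# Hypothetical metadata shape, using the field names from the
# strategy example (input_tokens, output_tokens, *_cost).
@dataclass
class CallMetadata:
    input_tokens: int
    output_tokens: int
    input_tokens_cost: float
    output_tokens_cost: float

def total_cost(calls: list[CallMetadata]) -> float:
    # Aggregation every call site must remember to do itself.
    return sum(c.input_tokens_cost + c.output_tokens_cost for c in calls)

calls = [
    CallMetadata(100, 20, 0.001, 0.002),
    CallMetadata(50, 10, 0.0005, 0.001),
]
print(round(total_cost(calls), 6))  # 0.0045
```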

Strategy metadata mode

A metadata handler strategy is provided at client creation. The strategy is invoked automatically on every call that passes a context; the context carries any additional data needed when handling the call's metadata.

# 1. Define your strategy
class LogStrategy(ClientMetadataStrategy[LLMLogContext]):
    def handle(self, call: ClientCall, context: LLMLogContext):
        print(f"Tokens: {call.input_tokens + call.output_tokens}")
        print(f"Cost: {call.input_tokens_cost + call.output_tokens_cost}")
        db.save(call.model_dump(), context.request_id)

# 2. Create client with the strategy
client = create_client(LLMModel.gpt_4o_mini, metadata_strategies=[LogStrategy()])

# 3. Pass context to trigger the strategy
result = await client.structured_response(
    JudgmentResult, messages, context=llm_log_context
)

Advantages:

  • Metadata handling is set up once and applied consistently
  • Call sites stay clean — no need to unpack or handle metadata each time

Disadvantages:

  • More initial setup
  • Metadata handling is implicit, which can be harder to trace
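To make the implicit dispatch easier to trace, here is a rough sketch of how a client could invoke its strategies internally. This is an assumption about the mechanism, not yalc's actual code; the `_dispatch` method and the simplified `ClientCall` are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical, simplified call record.
@dataclass
class ClientCall:
    input_tokens: int
    output_tokens: int

class Client:
    def __init__(self, strategies):
        self.strategies = strategies

    def _dispatch(self, call, context):
        # Strategies only fire when a context is passed, matching the
        # documented behavior; without one, metadata goes unhandled.
        if context is None:
            return
        for strategy in self.strategies:
            strategy(call, context)

seen = []
client = Client([lambda call, ctx: seen.append((call.input_tokens, ctx))])
client._dispatch(ClientCall(10, 5), "req-1")
print(seen)  # [(10, 'req-1')]
```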

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

yalc-0.2.2.tar.gz (7.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

yalc-0.2.2-py3-none-any.whl (9.1 kB)

Uploaded Python 3

File details

Details for the file yalc-0.2.2.tar.gz.

File metadata

  • Download URL: yalc-0.2.2.tar.gz
  • Upload date:
  • Size: 7.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for yalc-0.2.2.tar.gz

  • SHA256: e0ee641b0ebb911014d6f5a6beb2e591971d5d8e9cccd574a30dff107cf5ba13
  • MD5: 26b23bfbcc58bebff8007e26e342bfae
  • BLAKE2b-256: 9588d34316585df932d9d4a72b9eb1474ead29a0a34c105a6499bbc64189915d


Provenance

The following attestation bundles were made for yalc-0.2.2.tar.gz:

Publisher: publish.yml on cognitai-labs-dev/yalc

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file yalc-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: yalc-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 9.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for yalc-0.2.2-py3-none-any.whl

  • SHA256: 04aba316bac9e131c245bd7f08bdc77eeae04ce23b9dfb75783d839164a2432c
  • MD5: f1e92116372908fdcf57ebe0c555d188
  • BLAKE2b-256: defc81163db31a3646d3160dbbc23e6fea45bd37ccba5101ca3bdff7457f28c9


Provenance

The following attestation bundles were made for yalc-0.2.2-py3-none-any.whl:

Publisher: publish.yml on cognitai-labs-dev/yalc

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
