Yet Another LLM Client


An opinionated Python wrapper for LLM calls. Supports multiple LLM providers:

  • OpenAI
  • Anthropic
  • more to come...

Uses Pydantic models to serialize LLM responses: every response is parsed into a Pydantic model that you define.
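
As a rough sketch of what that serialization amounts to, a response model is an ordinary Pydantic model; the `JudgmentResult` fields and the raw JSON below are hypothetical stand-ins, not part of YALC:

```python
from pydantic import BaseModel

# Hypothetical response model -- any Pydantic model works.
class JudgmentResult(BaseModel):
    verdict: str
    confidence: float

# YALC validates the raw LLM output into the model; conceptually
# that boils down to something like:
raw = '{"verdict": "pass", "confidence": 0.92}'
result = JudgmentResult.model_validate_json(raw)
print(result.verdict, result.confidence)
```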

Full async support.

Checking models

To verify which models in LLMModel are reachable with your current API keys:

cp .env.example .env  # fill in your API keys
uv run python scripts/check_models.py

Each model is called concurrently and results print as they complete.
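
The script itself isn't shown here, but the pattern it describes (fire all checks at once, print each result as it completes) can be sketched with `asyncio.as_completed`; the model names and the `check` coroutine below are stand-ins, not YALC's actual implementation:

```python
import asyncio

async def check(model: str) -> tuple[str, bool]:
    # Stand-in for a real reachability check; a real one would
    # call the provider's API with your key.
    await asyncio.sleep(0.01)
    return model, True

async def main() -> None:
    models = ["gpt-4o-mini", "claude-3-5-haiku"]  # hypothetical list
    tasks = [asyncio.create_task(check(m)) for m in models]
    # Results print in completion order, not submission order.
    for fut in asyncio.as_completed(tasks):
        model, ok = await fut
        print(f"{model}: {'reachable' if ok else 'unreachable'}")

asyncio.run(main())
```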

Usage

Every call to the LLM returns metadata alongside the result: token usage, cost, the model used, and the context messages. YALC supports two modes of operation for handling this metadata.

Metadata return mode

Metadata is returned directly alongside the response as a tuple.

client = create_client(LLMModel.gpt_4o_mini)

result, metadata = await client.structured_response(
    JudgmentResult, messages
)

Advantages:

  • Simple, no setup required
  • Direct access to metadata at the call site

Disadvantages:

  • Must handle metadata manually on every call
  • Easy to forget or handle inconsistently across call sites
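
To illustrate the manual bookkeeping, here is a sketch of tallying cost across calls in return mode. The stub metadata class and its field names are assumptions modeled on the cost fields YALC exposes, not a confirmed API:

```python
from dataclasses import dataclass

# Stub standing in for the metadata returned with each call;
# field names are assumptions, not YALC's real types.
@dataclass
class CallMetadata:
    input_tokens_cost: float
    output_tokens_cost: float

# In return mode, every call site must remember to do this tally itself:
total_cost = 0.0
for metadata in [CallMetadata(0.002, 0.001), CallMetadata(0.003, 0.004)]:
    total_cost += metadata.input_tokens_cost + metadata.output_tokens_cost

print(f"total cost: ${total_cost:.3f}")
```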

Strategy metadata mode

A metadata handler strategy is provided at client creation. The strategy is invoked automatically on every call that receives a context; the context carries any additional data the strategy needs when handling LLM call metadata.

# 1. Define your strategy
class LogStrategy(ClientMetadataStrategy[LLMLogContext]):
    def handle(self, call: ClientCall, context: LLMLogContext):
        print(f"Tokens: {call.input_tokens + call.output_tokens}")
        print(f"Cost: {call.input_tokens_cost + call.output_tokens_cost}")
        db.save(call.model_dump(), context.request_id)  # db: your own persistence layer

# 2. Create client with the strategy
client = create_client(LLMModel.gpt_4o_mini, metadata_strategies=[LogStrategy()])

# 3. Pass context to trigger the strategy
result = await client.structured_response(
    JudgmentResult, messages, context=llm_log_context
)

Advantages:

  • Metadata handling is set up once and applied consistently
  • Call sites stay clean — no need to unpack or handle metadata each time

Disadvantages:

  • More initial setup
  • Metadata handling is implicit, which can be harder to trace

Download files

Source Distribution

yalc-0.3.2.tar.gz (110.6 kB)

Built Distribution

yalc-0.3.2-py3-none-any.whl (9.0 kB)

File details

Details for the file yalc-0.3.2.tar.gz.

  • Download URL: yalc-0.3.2.tar.gz
  • Upload date:
  • Size: 110.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

Hashes for yalc-0.3.2.tar.gz

  • SHA256: 71eb3f87e385dd15c589bffa366242dba7ffa357e2bff2c62d1b383e9710bc69
  • MD5: 9a2f1547dd519358552b103d352416c7
  • BLAKE2b-256: 63f3ba1a90f2f35243ca7d0f2b3bb78ddd087c18aabbe7842305098c6ee34a80

Provenance

The following attestation bundles were made for yalc-0.3.2.tar.gz:

Publisher: publish.yml on cognitai-labs-dev/yalc

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file yalc-0.3.2-py3-none-any.whl.

  • Download URL: yalc-0.3.2-py3-none-any.whl
  • Upload date:
  • Size: 9.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

Hashes for yalc-0.3.2-py3-none-any.whl

  • SHA256: 07035b68f180e2f56d4ea249a6a4adbddd531bd7fecf708bfbf1e99ee88c48b1
  • MD5: 269654f61d61e12a8bcd9b3a51ff6b5a
  • BLAKE2b-256: f4e679373059f254a6917c1dd7220e2a175f231dea17244cea94a29caed7eb8f

Provenance

The following attestation bundles were made for yalc-0.3.2-py3-none-any.whl:

Publisher: publish.yml on cognitai-labs-dev/yalc

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
